Abstract:
This paper presents an extensive review of services, Six Sigma, and the application of Six Sigma in services. Improving service quality requires a focus on the service process, and Six Sigma is a philosophy that concentrates on process improvement; properly applied, it can therefore be useful for services. This study focuses on applying Six Sigma to a wider range of services. Its wider applicability depends on identifying key performance indicators (KPIs) for different types of service processes. A case study is conducted in call center services to identify, analyze, and compare critical-to-quality characteristics (CTQs) and KPIs with those of other service types reported in the literature. This study will be helpful to both practitioners and researchers.
Abstract:
The concept of Six Sigma was initiated by Motorola in the 1980s and has since been implemented in many manufacturing and service organizations. Among services, health care and finance have been the major beneficiaries to date, and its application is gradually picking up in other services such as call centers, utilities, and public services. This paper provides empirical evidence on Six Sigma implementation in service industries in Singapore. Using a sample of 50 service organizations (10 responses from organizations that have implemented Six Sigma), the paper clarifies the status of Six Sigma in Singaporean service organizations. The findings confirm the critical success factors, critical-to-quality characteristics, tools, and key performance indicators observed in the literature. That "not relevant" was given as a reason for not implementing Six Sigma shows the need to understand the specific requirements of service organizations before applying it.
Abstract:
This paper presents a Six Sigma case study analysis of three service organizations in Singapore: a local hospital, a construction and related engineering service, and a consultancy service. These organizations embarked on their Six Sigma journey around 2003-2004, with the hospital slightly ahead of the other two. They have since achieved significant service improvements by implementing Six Sigma across their different divisions. Through a series of structured interviews with Six Sigma project champions, team leaders, and members, together with project reports, public archives, and observations, this study explores the Six Sigma journey of these organizations. The results portray the success factors that led to the Six Sigma initiatives, the process of Six Sigma implementation through proper identification of critical-to-quality characteristics, tools, and techniques, and the performance indicators that display the improvements due to Six Sigma.
Abstract:
The research objective of this thesis was to contribute to Bayesian statistical methodology, specifically to risk assessment methodology and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. I hoped to consider two applied areas and to use these applications as a springboard for developing new statistical methods, as well as undertaking analyses that might answer particular applied questions. The thesis therefore considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater, and a four-dimensional dataset assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and to use Bayesian hierarchical models to explore the necessarily complex modelling of four-dimensional agricultural data. The specific objectives of the research were: to develop a method for calculating credible intervals for the point estimates of Bayesian networks; to develop a model structure that incorporates all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day's data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days' data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset, considering the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations. This work forms five papers: two have been published, two are submitted, and the final paper is still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of those data as an 'errors-in-variables' problem [Fuller, 1987]. This illustrated a simple method for incorporating experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models are used with neighbours only in the same depth layer. This gave the model flexibility, allowing both the spatially structured and unstructured variances to differ at each depth. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth.
Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals. These models were fitted using WinBUGS [Lunn et al., 2000]. The fifth paper deals with the fact that, for large datasets, WinBUGS becomes problematic because of its highly correlated term-by-term updating. In this work we introduce a Gibbs sampler with block updating for the CAR layered model; the sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that soil moisture under each treatment reaches a level particular to that treatment at a depth of 200 cm and thereafter stays constant, albeit with variance increasing with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time with the spatial dimensions to consider. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility, but it does not by itself give insight into how the parameter of interest varies over time, so a two-stage approach was also used, with the estimates from the first stage analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
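The first of the objectives above lends itself to a small illustration. The following is a minimal Python sketch, not the thesis's model: it assumes a toy two-node risk DAG with hypothetical Beta distributions standing in for elicited uncertainty on the conditional probabilities, and shows how propagating that uncertainty by simulation yields a credible interval for the outcome probability rather than a bare point estimate.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Toy risk DAG: Exposure -> Illness. The Beta parameters below are
# hypothetical stand-ins for elicited expert uncertainty, not values
# from the thesis.
p_exposure = rng.beta(2, 18, n_draws)          # P(exposure event)
p_ill_given_exp = rng.beta(3, 27, n_draws)     # P(illness | exposure)
p_ill_given_no = rng.beta(1, 199, n_draws)     # P(illness | no exposure)

# Marginal risk of illness for each draw of the uncertain probabilities.
p_illness = (p_exposure * p_ill_given_exp
             + (1 - p_exposure) * p_ill_given_no)

point = np.median(p_illness)
lo, hi = np.percentile(p_illness, [2.5, 97.5])
print(f"P(illness): {point:.4f}, 95% credible interval [{lo:.4f}, {hi:.4f}]")
```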
Abstract:
The need for accessible housing in Australia is acute. Both government and the community service sector recognise the importance of well designed accessible housing to optimise the integration of older people and people with disability, to encourage prudent use of scarce health and community services, and to enhance the liveability of our cities. In 2010, the housing industry negotiated with the Australian Government and community representatives to adopt a nationally consistent voluntary code (Livable Housing Design) and a strategy to provide a minimal level of accessibility in all new housing by 2020. Evidence from the implementation of such programs in the United Kingdom and the USA, however, calls into question whether this aspirational goal can be achieved through voluntary codes. Minimal demand at the point of new sale, and problems in producing housing to the required standards, have raised questions about applying the program's principles in the context of a voluntary code. Addressing the latter issue, this paper presents early findings from an analysis of qualitative interviews with developers, builders, and designers in various housing contexts. It identifies their "logics in use" in producing housing in response to Livable Housing Design's voluntary code and indicates factors likely to assist and to impede attainment of the 2020 aspirational goal.
Abstract:
Lighting industry professionals work in an international marketplace and encounter a range of associated social, geographical, and cultural challenges. Education in lighting should introduce students to these challenges. To this end, an international field trip was recently undertaken to provide an authentic learning experience: twelve Master of Lighting students from two Australian universities took part in a field trip to Shanghai, China and surrounding areas. The goal was to offer students insight into practical issues in the lighting industry at an international level, in a unique and authentic context. To evaluate the outcomes of the trip, each participant was surveyed afterwards. The benefits identified were: increased knowledge of and insight into manufacturing issues in lighting; experiential learning in lighting design practice not available locally (e.g., master planning); increased understanding of cultural influences on design; and enhanced professional contacts within the lighting industry. Field trips may also act as an inverted-curriculum experience for new students, engaging them and promoting learning within a professional context.
Abstract:
We present a novel, web-accessible scientific workflow system that makes large-scale comparative studies accessible without programming or excessive configuration. GPFlow allows a workflow defined on single input values to be automatically lifted to operate over collections of input values, and supports the formation and processing of collections of values without explicit iteration constructs. We introduce a new model for collection processing based on key aggregation and slicing, which guarantees processing integrity and facilitates automatic association of inputs, allowing scientific users to manage the combinatorial explosion of data values inherent in large-scale comparative studies. The approach is demonstrated using a core task from comparative genomics, and builds upon our previous work supporting combined interactive and batch operation through a lightweight web-based user interface.
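To make the lifting idea concrete: the following is an illustrative Python sketch, not GPFlow's actual API (the names `lift` and `align` are invented for the example). A step written for single values is lifted to run over keyed collections, with inputs associated automatically by key, which is the essence of the key-aggregation model described above.

```python
# Illustrative sketch of key-based lifting; `lift` and `align` are
# hypothetical helpers, not part of GPFlow's real interface.

def align(*collections):
    """Associate inputs by shared key, as key aggregation would."""
    shared = set(collections[0]).intersection(*collections[1:])
    for key in sorted(shared):
        yield key, tuple(c[key] for c in collections)

def lift(step):
    """Lift a single-value workflow step to operate over keyed collections."""
    def lifted(*collections):
        return {key: step(*args) for key, args in align(*collections)}
    return lifted

# A step defined on single values: compare two sequence lengths.
def length_ratio(seq_a, seq_b):
    return len(seq_a) / len(seq_b)

genomes_a = {"geneX": "ATGGCC", "geneY": "ATG"}
genomes_b = {"geneX": "ATG", "geneY": "ATGGCCTTA"}

# No explicit iteration appears in the workflow definition itself.
print(lift(length_ratio)(genomes_a, genomes_b))
# {'geneX': 2.0, 'geneY': 0.3333333333333333}
```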
Abstract:
Automatic species recognition plays an important role in assisting ecologists to monitor the environment. One critical issue in this research area is that software developers need prior knowledge of the specific target species in order to build templates for them. This paper proposes a novel approach to automatic species recognition that detects species using generic knowledge about acoustic events. Acoustic component detection is the most critical and fundamental part of the proposed approach. The paper gives clear definitions of acoustic components and presents three clustering algorithms for detecting four acoustic components in sound recordings: whistles, clicks, slurs, and blocks. The experimental results demonstrate that these acoustic component recognisers achieve high precision and recall.
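The paper's three clustering algorithms are not reproduced here, but the general shape of acoustic component detection can be sketched. Below is a minimal, generic Python illustration (the threshold and the aspect-ratio rule are assumptions for the example, not the paper's method): energy-thresholding a spectrogram, grouping time-frequency pixels into connected regions, and crudely labelling long horizontal regions as whistle-like and short, broadband ones as click-like.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import label, find_objects

def detect_components(audio, fs, db_threshold=-30.0):
    """Generic acoustic-event detection sketch (not the paper's algorithm)."""
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=512)
    sxx_db = 10 * np.log10(sxx + 1e-12)
    mask = sxx_db > sxx_db.max() + db_threshold   # keep pixels within 30 dB of peak

    regions, n = label(mask)                      # connected components
    events = []
    for sl in find_objects(regions):
        f_span = f[sl[0]][-1] - f[sl[0]][0]       # Hz extent of region
        t_span = t[sl[1]][-1] - t[sl[1]][0]       # seconds extent
        # Crude shape rule, assumed for illustration only:
        kind = "whistle-like" if t_span > 0.1 and f_span < 1000 else "click-like"
        events.append((kind, t[sl[1]][0], f[sl[0]][0]))
    return events

# Synthetic test: a 2 kHz whistle in white noise.
fs = 16000
t = np.linspace(0, 1, fs, endpoint=False)
audio = np.sin(2 * np.pi * 2000 * t) + 0.05 * np.random.randn(fs)
print(detect_components(audio, fs)[:3])
```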
Abstract:
Six Sigma is considered an important management philosophy for obtaining satisfied customers, yet financial service organisations have been slow to adopt it. Despite the extensive effort that has been invested and the benefits that can be obtained, systematic implementation of Six Sigma in financial service organisations remains limited, and a company-wide implementation framework has so far been missing; this paper tries to fill that gap. Based on theory, a conceptual framework is developed and then evaluated by experts from financial institutions. The results show that it is very important to link Six Sigma with both the strategic and the operations level. Furthermore, although Six Sigma is a very important method for improving process quality, others such as Lean Management are also used, which requires a superior project portfolio management to coordinate the resources and projects of Six Sigma with those of the other methods. Beyond its theoretical contribution, the framework can be used by financial service companies to evaluate their Six Sigma activities. Thus the framework, grounded in the literature and in empirical data, will be a useful guide for the sustainable and successful implementation of a Six Sigma initiative in financial service organisations.
Abstract:
Non-invasive vibration analysis has been used extensively to monitor the progression of dental implant healing and stabilization, and it is now being considered as a method for monitoring femoral implants in transfemoral amputees. This paper evaluates two modal analysis excitation methods and investigates their ability to detect changes at the implant-bone interface that occur during osseointegration. Excitation of bone-implant physical models with an electromagnetic shaker provided higher coherence values and a greater number of modes over the same frequency range than the impact hammer. Differences were detected in the natural frequencies and fundamental mode shape of the model when the fit of the implant in the bone was altered. The ability to detect changes in the model's dynamic properties demonstrates the potential of modal analysis in this application and warrants further investigation.
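As a rough illustration of the measurement chain described here (and not of this paper's specific experimental protocol), the Python sketch below estimates a frequency response function and coherence from simulated excitation and response signals. In practice, peaks of the FRF magnitude are read off as candidate natural frequencies, and the coherence indicates how trustworthy each band of the estimate is; the 200 Hz resonance below is an arbitrary stand-in.

```python
import numpy as np
from scipy import signal

fs = 2048
rng = np.random.default_rng(0)
x = rng.standard_normal(8 * fs)               # broadband excitation (shaker-like)

# Simulated structure: a single resonance near 200 Hz (illustrative only).
b, a = signal.iirpeak(200, Q=30, fs=fs)
y = signal.lfilter(b, a, x) + 0.01 * rng.standard_normal(x.size)

# H1 frequency response estimate: cross-spectrum over input auto-spectrum.
f, pxy = signal.csd(x, y, fs=fs, nperseg=1024)
_, pxx = signal.welch(x, fs=fs, nperseg=1024)
frf = np.abs(pxy / pxx)

# Coherence: near 1 where the response is linearly related to the excitation.
_, cxy = signal.coherence(x, y, fs=fs, nperseg=1024)

peak = np.argmax(frf)
print(f"estimated natural frequency ~ {f[peak]:.0f} Hz, "
      f"coherence there = {cxy[peak]:.2f}")
```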
Abstract:
This paper introduces our research on influencing the experience of people in urban public places through mobile mediated interactions. Information and communication technology (ICT) devices are sometimes used to create personal space while in public, but they could also be used to digitally augment urban space with non-privacy-sensitive data, enabling anonymous, mobile mediated interactions between collocated strangers. We present the motivation for research on digital augmentation and mobile mediated interaction between unknown urban dwellers, define the research problem that drives this study, and explain its significance for the field of pervasive social networking. The paper illustrates three design interventions that enable social pervasive content sharing, employing pervasive presence, awareness, and anonymous social user interaction in urban public places, and concludes with an outlook and a summary of the research effort.
Abstract:
The Lingodroids are a pair of mobile robots that evolve a language for places and for relationships between places (based on distance and direction). Each robot in these studies has its own understanding of the layout of the world, based on its unique experiences and exploration of the environment. Despite having different internal representations of the world, the robots are able to develop a common lexicon for places, and then use simple sentences to explain and understand relationships between places, even places they could not physically experience, such as areas behind closed doors. By learning the language, the robots develop representations for places that are inaccessible to them and, later, when the doors are opened, use those representations to perform goal-directed behavior.
Abstract:
The main aim of this thesis is to analyse and optimise a public hospital Emergency Department (ED). The ED is a complex system with limited resources and high demand for them, and adding to the complexity is the stochastic nature of almost every element and characteristic of the ED. Interaction with other functional areas complicates the system further, as these areas have a huge impact on the ED while the ED is powerless to change them. It is therefore imperative that operations research (OR) be applied to the ED to improve its performance within the constraints of the system. The main characteristics to optimise included tardiness, adherence to waiting time targets, access block, and length of stay. A validated and verified simulation model was built to model the real-life system, enabling detailed analysis of resources and flow without disruption to the actual ED, and allowing a wide range of ED policies and resource configurations to be investigated. Of particular interest were the number and type of beds in the ED and the shift times of physicians; notably, neither resource works in isolation, so optimising the system requires investigating both in tandem. The ED was likened to a flow shop scheduling problem, with patients and beds playing the roles of the jobs and machines typically found in manufacturing problems, which enabled an analytic scheduling approach. Constructive heuristics were developed to reactively schedule the system in real time, and these improved the system's performance. Metaheuristics that optimise the system were also developed and analysed. An innovative hybrid Simulated Annealing and Tabu Search algorithm was developed that out-performed both simulated annealing and tabu search alone by combining features of each; the new algorithm reaches a better solution, and does so in a shorter time.
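The hybridisation described here suggests a simple sketch. The following Python example is a generic illustration of combining simulated annealing's probabilistic acceptance with a tabu list on recent moves, applied to a toy single-machine tardiness problem; it is an assumption-laden stand-in, not the thesis's algorithm, and the instance data and parameters are invented.

```python
import math, random

random.seed(1)

# Toy scheduling instance: (processing_time, due_date) per job.
jobs = [(4, 10), (2, 6), (6, 14), (3, 7), (5, 20), (7, 21)]

def total_tardiness(order):
    t, total = 0, 0
    for j in order:
        p, d = jobs[j]
        t += p
        total += max(0, t - d)
    return total

def hybrid_sa_tabu(n_iters=5000, temp=10.0, cooling=0.999, tabu_len=20):
    order = list(range(len(jobs)))
    cost = total_tardiness(order)
    best, best_cost = order[:], cost
    tabu = []                                  # recently used swap moves
    for _ in range(n_iters):
        i, j = random.sample(range(len(jobs)), 2)
        move = (min(i, j), max(i, j))
        candidate = order[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        c = total_tardiness(candidate)
        # Tabu filter with aspiration: a tabu move is allowed only if it
        # improves on the best solution found so far.
        allowed = move not in tabu or c < best_cost
        # Simulated-annealing acceptance of non-improving moves.
        if allowed and (c <= cost or random.random() < math.exp((cost - c) / temp)):
            order, cost = candidate, c
            tabu.append(move)
            if len(tabu) > tabu_len:
                tabu.pop(0)
            if cost < best_cost:
                best, best_cost = order[:], cost
        temp *= cooling
    return best, best_cost

print(hybrid_sa_tabu())   # a job order and its total tardiness
```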
Abstract:
The growth of technologies and tools branded as 'new media' or 'Web 2.0' has sparked much discussion about the internet and its place in all facets of social life. Such debate includes the potential for blogs and citizen journalism projects to replace or alter journalism and mainstream media practices. However, while the journalism-blog dynamic has attracted the most attention, the actual work of political bloggers, the roles they play in the mediasphere, and the resources they use have been comparatively ignored. This project looks at political blogging in Australia and France: sites commenting on or promoting political events and ideas, run by citizens, politicians, and journalists alike. In doing so, it examines the structure of the networks formed by bloggers and the nature of communication within political blogospheres. Previous studies of political blogging around the world have focussed on individual nations, finding that in some cases the networks are divided between different political ideologies. By comparing two countries with different political representation (a two-party-dominated system versus a wider political spectrum), this study determines the structure of these political blogospheres and correlates those structures with the political environment in which they are situated. The thesis adapts concepts from communication and media theories, including framing, agenda setting, and opinion leaders, to examine the work of political bloggers and their place within the mediasphere. As well as developing a hybrid theoretical base for research into blogs and other online communication, the project outlines new methodologies for studying online activity through the analysis of several topical networks within the wider activity collected for this project. The project draws on hyperlink and textual data collected from a sample of Australian and French blogs between January and August 2009. From these data, the thesis provides an overview of 'everyday' political blogging, showing posting patterns over several months of activity, away from national elections and their associated campaigns. Whereas other work in this field has looked solely at cumulative networks, treating the collected data as a static network, this project also looks at specific cases to see how the blogospheres change with time and with topics of discussion. Three case studies examine how blogs cover politics: an international political event (the Obama inauguration) and two local political topics (the opposition to the 'Création et Internet', or HADOPI, law in France, and the 'Utegate' scandal in Australia). Using a mixture of qualitative and quantitative methods, the study analyses data collected from a population of sites in both countries, looking at their linking patterns, relationships with mainstream media, and topics of interest. The project thereby helps to further develop methodologies in this field and provides new and detailed information on online networks and internet-based political communication in Australia and France.
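As an illustration of the kind of hyperlink analysis mentioned here (a generic sketch with made-up blog names, not the thesis's dataset or method), the snippet below builds a directed link network between blogs and reports basic linking patterns such as in-degree and reciprocated links.

```python
import networkx as nx

# Hypothetical blog-to-blog hyperlinks; names are invented for the example.
links = [
    ("blog_a", "blog_b"), ("blog_a", "blog_c"),
    ("blog_b", "blog_a"), ("blog_c", "blog_d"),
    ("blog_d", "blog_c"), ("blog_e", "blog_a"),
]

g = nx.DiGraph(links)

# Most-linked-to blogs: a crude proxy for 'opinion leader' status.
by_indegree = sorted(g.in_degree(), key=lambda kv: kv[1], reverse=True)
print("in-degree:", by_indegree)

# Reciprocated links hint at conversation rather than one-way citation.
mutual = [(u, v) for u, v in g.edges() if g.has_edge(v, u) and u < v]
print("reciprocated:", mutual)
```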