768 results for Phenomenological approach
Abstract:
Dementia is an irreversible and incurable syndrome that leads to progressive impairment of cognitive functions and to behavioural and psychological symptoms such as agitation, depression and psychosis. Appropriate environmental conditions can help delay its onset and progression, and indoor environmental (IE) factors have a major impact. However, there is no firm understanding of the full range of relevant IE factors and their impact levels. This paper describes a preliminary study investigating the effects of the IE on dementia residents of Hong Kong residential care homes (RCHs). The study involved six purposively selected focus groups, each comprising the main stakeholders: the dementia residents’ caregivers, RCH staff and/or registered nurses, and architects. Using the Critical Incident Technique, the main contexts and experiences of behavioural problems of dementia residents caused by the IE were explored, and the key causal RCH IE quality factors were identified, together with the associated responses and stress levels involved. The findings indicate that the acoustic environment, lighting and the thermal environment are the most important influencing factors. Many of the remedies suggested by the focus groups are quite simple to carry out and are summarised as recommendations to current RCH providers and users. The knowledge acquired in this initial study will help enrich IE design for dementia-specific residential facilities. It also provides preliminary insights for healthcare policymakers and for practitioners in the building design/facilities management and dementia-care sectors into the IE factors contributing to a more comfortable, healthy and sustainable RCH living environment in Hong Kong.
Abstract:
Objectives: PEPA is funded by the Department of Health and Ageing and aims to further improve the skill and confidence of the generalist workforce in working with people with palliative care needs. Recent quality improvement initiatives to promote transfer of learning into practice include the appointment of a clinical educator, implementation of an online module for mentors, and delivery of a mentoring workshop (in collaboration with NSAP and PCC4U). This paper presents an overview of outcomes from these quality improvement initiatives. Methods: PEPA host sites are selected based on their specialist palliative care level. Host site managers are surveyed six-monthly, and participants are surveyed before and three months after placement to collect open and fixed response data on their experience of the program. Participants in the mentoring workshop (n=39) were asked to respond to a survey regarding the workshop outcomes. Results: The percentage of placement participants who strongly agreed they ‘have the ability to implement the interventions required for people who have a life-limiting illness’ increased from 35% in 2011 (n=34) to 51% in 2012 (n=91) post-placement. Responses from mentor workshop participants indicated that 76% of respondents (n=25) agreed that they were able to identify principles for mentoring in the context of palliative care. In 2012, 61% of host site managers (n=54) strongly agreed that PEPA supports clinicians working with people with a life-limiting illness. Conclusion: Strategies to build the capabilities of palliative care professionals to mentor and support the learning experience of PEPA participants are critical to ongoing improvement of the program.
Abstract:
Lean strategies have been developed to eliminate or reduce waste and thus improve operational efficiency in a manufacturing environment. In practice, however, manufacturers encounter difficulties in selecting appropriate lean strategies within their resource constraints and in quantitatively evaluating the perceived value of manufacturing waste reduction. This paper presents a methodology for quantitatively evaluating the contribution of lean strategies selected to reduce manufacturing wastes within the manufacturer’s resource (time) constraints. A mathematical model is developed for evaluating the perceived value of lean strategies for manufacturing waste reduction, and a step-by-step methodology is provided for selecting appropriate lean strategies to improve manufacturing performance within those resource constraints. A computer program is developed in MATLAB for finding the optimum solution. The proposed methodology and the developed model have been validated with the help of a case study. A ‘lean strategy-wastes’ correlation matrix is proposed to establish the relationship between manufacturing wastes and lean strategies. Using the correlation matrix and applying the proposed methodology and mathematical model, the authors derive the optimised perceived value of reducing a manufacturer’s wastes by implementing appropriate lean strategies within the manufacturer’s resource constraints. Results also demonstrate that the perceived value of manufacturing waste reduction can change significantly depending on the policies and product strategy adopted by a manufacturer. The proposed methodology can also be used in dynamic situations by changing the input to the MATLAB program. By identifying appropriate lean strategies for specific manufacturing wastes, a manufacturer can better prioritise implementation efforts and resources to maximise the success of implementing lean strategies in their organisation.
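The selection problem this abstract describes can be viewed as a small combinatorial optimisation: pick the subset of lean strategies whose total implementation time fits the budget while maximising a correlation-weighted value. The sketch below illustrates that idea in Python with entirely hypothetical strategies, wastes and numbers; it is not the paper's MATLAB model.

```python
from itertools import combinations

# Hypothetical 'lean strategy-wastes' correlation matrix: rows are strategies,
# columns are wastes. None of these numbers come from the paper.
correlation = {
    "5S":     {"waiting": 0.6, "defects": 0.3, "inventory": 0.1},
    "Kanban": {"waiting": 0.4, "defects": 0.1, "inventory": 0.8},
    "TPM":    {"waiting": 0.2, "defects": 0.7, "inventory": 0.2},
}
time_cost = {"5S": 3, "Kanban": 4, "TPM": 5}        # implementation time units
waste_weight = {"waiting": 1.0, "defects": 1.5, "inventory": 0.8}

def perceived_value(strategies):
    """Correlation-weighted contribution of a strategy subset to waste reduction."""
    return sum(correlation[s][w] * waste_weight[w]
               for s in strategies for w in waste_weight)

def best_subset(time_budget):
    """Exhaustively search for the subset maximising value within the time budget."""
    best, best_val = (), 0.0
    for r in range(1, len(correlation) + 1):
        for combo in combinations(correlation, r):
            if sum(time_cost[s] for s in combo) <= time_budget:
                value = perceived_value(combo)
                if value > best_val:
                    best, best_val = combo, value
    return best, best_val
```

With a budget of 8 time units this toy instance selects 5S and TPM; for more than a handful of strategies, a dynamic-programming knapsack would scale better than exhaustive search.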
Abstract:
Why do some people remain lean while others are susceptible to obesity, and why do obese individuals vary in their successes in losing weight? Despite physiological processes that promote satiety and satiation, some individuals are more susceptible to overeating. While the phenomena of susceptibility to weight gain, resistance to treatment or weight loss, and individual variability are not novel, they have yet to be exploited and systematically examined to better understand how to characterise phenotypes of obesity. The identification and characterisation of distinct phenotypes not only highlight the heterogeneous nature of obesity but may also help to inform the development of more tailored strategies for the treatment and prevention of obesity. This review examines the evidence for different susceptible phenotypes of obesity that are characterised by risk factors associated with the hedonic and homeostatic systems of appetite control.
Abstract:
Integration of small-scale electricity generators, known as Distributed Generation (DG), into distribution networks has become increasingly popular. This tendency, together with the falling price of synchronous-type generators, gives DG a better chance of participating in the voltage regulation process alongside other devices already available in the system. The voltage control issue is a challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, we propose a control coordination technique that utilizes the ability of the DG to act as a voltage regulator while minimizing interaction with other active devices, such as the On-load Tap Changing Transformer (OLTC) and the voltage regulator. The technique is based on the concepts of control zones and Line Drop Compensation (LDC), as well as the choice of controller parameters. Simulations carried out on an Australian system show that the technique is suitable and flexible for any system with multiple regulating devices, including DG.
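As a rough illustration of the Line Drop Compensation idea mentioned above, the sketch below estimates the remote load-point voltage from a local regulator measurement and derives a tap-change decision. All per-unit values, the deadband, and the function names are illustrative assumptions, not taken from the paper.

```python
def ldc_setpoint(v_reg, i_line, r_set, x_set):
    """Line Drop Compensation: estimate the voltage at a remote load point
    from the regulator's local measurement and the line current, so the tap
    changer can regulate the remote voltage rather than the local one.
    All quantities are per-unit complex phasors (illustrative values only)."""
    return v_reg - i_line * complex(r_set, x_set)

def tap_action(v_load, v_target=1.0, bandwidth=0.02):
    """OLTC decision given the LDC-estimated load voltage and a deadband."""
    if abs(v_load) < v_target - bandwidth / 2:
        return "raise"
    if abs(v_load) > v_target + bandwidth / 2:
        return "lower"
    return "hold"
```

For example, with a local measurement of 1.05 pu and a line current of 0.1 pu on a line with R = 0.2, X = 0.4 pu, the estimated load voltage sits above the deadband and the controller would lower the tap.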
Abstract:
Monitoring stream networks through time provides important ecological information. The sampling design problem is to choose the locations where measurements are taken so as to maximise the information gathered about physicochemical and biological variables on the stream network. This paper uses a pseudo-Bayesian approach, averaging a utility function over a prior distribution, to find a design which maximises the average utility. We use models for correlations of observations on the stream network that are based on stream network distances and described by moving average error models. The utility functions used reflect the needs of the experimenter, such as prediction of location values or estimation of parameters. We propose an algorithmic approach to design in which the mean utility of a design is estimated using Monte Carlo techniques and an exchange algorithm searches for optimal sampling designs. In particular, we focus on the problems of finding an optimal design from a set of fixed designs and of finding an optimal subset of a given set of sampling locations. As there are many different variables to measure, such as chemical, physical and biological measurements at each location, designs are derived from models based on different types of response variables: continuous, counts and proportions. We apply the methodology to a synthetic example and to the Lake Eacham stream network on the Atherton Tablelands in Queensland, Australia. We show that the optimal designs depend very much on the choice of utility function, varying from space-filling to clustered designs and mixtures of these; given the utility function, however, designs are relatively robust to the type of response variable.
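The exchange algorithm this abstract refers to can be sketched as a simple local search: repeatedly swap one chosen site for an unchosen candidate whenever the swap improves the design's utility. The toy below uses a deterministic space-filling utility (minimum pairwise distance between 1-D sites) in place of the paper's Monte Carlo-averaged decision-theoretic utilities; all names and values are illustrative.

```python
import random

def utility(design):
    """Toy space-filling utility: the minimum pairwise distance between sites.
    The paper instead averages decision-theoretic utilities over a prior by
    Monte Carlo; this deterministic stand-in keeps the exchange logic visible."""
    return min(abs(a - b) for i, a in enumerate(design) for b in design[i + 1:])

def exchange_search(candidates, k, sweeps=50, seed=0):
    """Exchange algorithm: start from a random k-site design, then repeatedly
    swap a chosen site for an unchosen candidate whenever utility improves."""
    rng = random.Random(seed)
    design = rng.sample(candidates, k)
    for _ in range(sweeps):
        improved = False
        for i in range(k):
            for c in candidates:
                if c in design:
                    continue
                trial = design[:i] + [c] + design[i + 1:]
                if utility(trial) > utility(design):
                    design, improved = trial, True
        if not improved:
            break
    return sorted(design)
```

On the candidate set [0, 3, 10] with k=2 the search settles on the most spread-out pair, [0, 10]; on a stream network the distances would be network distances rather than absolute differences, and a real implementation would restart from several random designs to escape local optima.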
Abstract:
Australia lacks a satisfactory national paradigm for assessing legal capacity in the context of testamentary, enduring power of attorney and advance care directive documents. Capacity assessments are currently conducted on an ad hoc basis by legal and/or medical professionals, and the reliability of the assessment process depends on the skill set and mutual understanding of the professionals conducting it. The prevalence of diseases such as dementia, which impair cognition, is growing; this increasingly necessitates collaboration between the legal and medical professions when assessing the effect of mentally disabling conditions upon legal capacity. Miscommunication and lack of understanding between the legal and medical professionals involved could impede the development of a satisfactory paradigm. This article discusses legal capacity assessment in Australia and how to strengthen the relationship between the legal and medical professionals involved in capacity assessments. The development of a national paradigm would promote consistency and transparency of process, helping to improve the professional relationship and to maximise the principles of autonomy, participation and dignity.
Abstract:
This Case Study relates to the creation and implementation of career‐focussed courses in Creative Media for film, television, animation, broadcast and web contexts. The paper examines the advantages and disadvantages of co‐teaching, and how different professional and academic backgrounds and disciplines can productively inform curriculum design and delivery in the academic/professional context. The authors, as co‐creators and co‐lecturers, have developed a number of courses which represent current working models for intermediate to advanced level academic/professional study, and attract students from across the creative disciplines, including theatre, media, visual arts and music. These courses are structured to develop in students a wide range of aesthetic and technical skills, as well as the ability to apply those skills professionally within and across the creative media industries. The paper also examines the balance between academic rigour, practical hands‐on skill development, assessment, logistics, resources and teamwork.
Abstract:
Online dating, a new community- and communication-oriented type of social network, is gaining momentum. With many people joining dating networks, users become overwhelmed by the choices for an ideal partner. A solution to this problem is to provide users with partner recommendations based on their interests and activities. Traditional recommendation methods ignore differences in users’ needs and provide recommendations to all users in the same way. In this paper, we propose a recommendation approach that applies different recommendation strategies to different groups of members. A segmentation method using the Gaussian Mixture Model (GMM) is proposed to capture users’ differing needs, and a targeted recommendation strategy is then applied to each identified segment. Empirical results show that the proposed approach outperforms several existing recommendation methods.
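A minimal illustration of GMM-based segmentation: the EM sketch below fits a two-component 1-D mixture to a hypothetical user activity score and routes each segment to a different recommendation strategy. A real system would use a library GMM and richer features; everything here, including the strategy names, is an assumption for illustration.

```python
import math

def em_gmm_1d(xs, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture (a stand-in for a
    library GMM; illustrative only). Returns hard labels and component means."""
    mu = [min(xs), max(xs)]            # initialise means at the data extremes
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    resp = []
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            total = sum(dens)
            resp.append([d / total for d in dens])
        # M-step: re-estimate mixture weights, means and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    labels = [max(range(2), key=lambda k: r[k]) for r in resp]
    return labels, mu

def recommend(activity_scores):
    """Hypothetical routing: active users get collaborative filtering,
    low-activity users get popularity-based recommendations."""
    labels, mu = em_gmm_1d(activity_scores)
    active = max(range(2), key=lambda k: mu[k])   # component with higher mean
    return ["collaborative" if l == active else "popularity" for l in labels]
```

The choice of two components and a single activity feature is purely for readability; the segmentation-then-targeted-strategy structure is the point.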
Abstract:
Textual document sets have become an important and rapidly growing information source on the web, and text classification is one of the crucial technologies for information organisation and management. It has become increasingly important and has attracted wide attention from researchers in different fields. This paper first introduces the main feature selection methods, implementation algorithms and applications of text classification. However, because the knowledge extracted by current data-mining techniques for text classification contains much noise, considerable uncertainty arises in the classification process, from both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve performance. Further improving knowledge extraction, and effectively utilising the extracted knowledge, remains a critical and challenging step. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to more precisely classify textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and a Rough Set-based decision-making approach for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, that is effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and related fields.
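The Rough Set machinery behind such an approach can be illustrated with the standard lower and upper approximations: documents whose equivalence class falls entirely inside a category are certain members, while documents in the boundary region (upper minus lower) are the "difficult to separate" cases. The sketch below is generic Rough Set theory, not the paper's framework; the feature key and documents are invented.

```python
def rough_approximations(universe, key, target):
    """Lower/upper approximation of `target` under the indiscernibility
    relation induced by `key` (documents with equal feature values are
    indistinguishable). The boundary region is upper - lower."""
    classes = {}
    for doc in universe:
        classes.setdefault(key(doc), set()).add(doc)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:
            lower |= cls   # every indistinguishable doc is in the category
        if cls & target:
            upper |= cls   # at least one indistinguishable doc is in it
    return lower, upper
```

Documents in the boundary region cannot be classified with certainty from the available features, which is where decision rules over finer attribute sets come into play.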
Abstract:
Recent modelling of socio-economic costs by the Australian railway industry in 2010 estimated the cost of level crossing accidents to exceed AU$116 million annually. To better understand the causal factors that contribute to these accidents, the Cooperative Research Centre for Rail Innovation is running a project entitled Baseline Level Crossing Video. The project aims to improve the recording of level crossing safety data by developing an intelligent system capable of detecting near-miss incidents and capturing quantitative data around these incidents. To detect near-miss events at railway level crossings, a video analytics module is being developed to analyse video footage obtained from forward-facing cameras installed on trains. This paper presents a vision-based approach for the detection of these near-miss events. The video analytics module comprises object detectors and a rail detection algorithm, allowing the distance between a detected object and the rail to be determined. An existing, publicly available Histograms of Oriented Gradients (HOG) based object detector is used to detect various types of vehicles in each video frame. As vehicles are usually seen side-on from the cabin’s perspective, the results of the vehicle detector are verified using an algorithm that detects the wheels of each detected vehicle. Rail detection is facilitated by a projective transformation of the video, such that the forward-facing view becomes a bird’s-eye view. A Line Segment Detector is employed as the feature extractor, and a sliding-window approach is developed to track a pair of rails. Vehicles are localised by projecting the results of the vehicle and rail detectors onto the ground plane, allowing the distance between the vehicle and the rail to be calculated. The resultant vehicle positions and distances are logged to a database for further analysis.
We present preliminary results on the performance of a prototype video analytics module on a data set of videos covering more than 30 different railway level crossings, captured from train journeys passing through these crossings.
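The localisation step described above, projecting detections onto the ground plane and measuring the vehicle-to-rail distance, can be sketched with a plain 3x3 homography. The matrix used in the example is an illustrative placeholder, not a calibration for a real cabin camera, and the function names are our own.

```python
import math

def apply_homography(H, point):
    """Map an image pixel to ground-plane coordinates with a 3x3 homography
    (the bird's-eye projective transformation); H is an illustrative matrix,
    not a real cabin-camera calibration."""
    x, y = point
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def distance_to_rail(H, vehicle_px, rail_px):
    """Ground-plane distance between a detected vehicle point and a rail point."""
    vx, vy = apply_homography(H, vehicle_px)
    rx, ry = apply_homography(H, rail_px)
    return math.hypot(vx - rx, vy - ry)
```

In practice H would be estimated from known ground-plane correspondences (e.g. the rail gauge), and the vehicle point would come from the detector's bounding box.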
Abstract:
In this paper we focus specifically on explaining variation in core human values, and suggest that individual differences in values can be partially explained by personality traits and the perceived ability to manage emotions in the self and others (i.e. trait emotional intelligence). A sample of 209 university students was used to test hypotheses regarding several proposed direct and indirect relationships between personality traits, trait emotional intelligence and values. Consistent with the hypotheses, Harm Avoidance and Novelty Seeking were found to directly predict Hedonism, Conformity, and Stimulation. Harm Avoidance was also found to indirectly predict these values through the mediating effects of key subscales of trait emotional intelligence. Novelty Seeking was not found to be an indirect predictor of values. Results have implications for our understanding of the relationship between personality, trait emotional intelligence and values, and suggest a common basis in terms of approach and avoidance pathways.
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. Towards this, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of effectively retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, for improving the accuracy and efficiency of WebPut. Moreover, several optimization techniques are also proposed to reduce the cost of estimating the confidence of imputation queries at both the tuple-level and the database-level. Experiments based on several real-world data collections demonstrate not only the effectiveness of WebPut compared to existing approaches, but also the efficiency of our proposed algorithms and optimization techniques.
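The greedy scheduling idea can be sketched as follows: always issue next the imputation query with the highest confidence, so that reliable early fills are available to inform later queries. The cell/query structure and confidence numbers below are invented for illustration and do not reflect WebPut's actual interfaces.

```python
def schedule_imputations(missing):
    """Greedy ordering: repeatedly pick the missing cell whose best candidate
    query has the highest confidence and issue that query next.
    `missing` maps a (row, column) cell to a list of (query, confidence)
    pairs; all names and values here are hypothetical."""
    order, pending = [], dict(missing)
    while pending:
        # Cell whose best candidate query is the most confident overall.
        cell = max(pending, key=lambda c: max(conf for _, conf in pending[c]))
        query, conf = max(pending[cell], key=lambda qc: qc[1])
        order.append((cell, query, conf))
        del pending[cell]
    return order
```

In the real system the confidences would be re-estimated as values are filled in, since each imputed value can change the evidence available to the remaining queries; this static version only shows the ordering principle.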