Abstract:
People in developed countries are living longer with the help of medical advances. The literature has shown that older people prefer to remain independent and live at home for as long as possible. It is therefore important to establish how best to accommodate and assist them in maintaining quality of life and independence while also easing the demand on human resources. Researchers have claimed that assistive devices support older people's independence; however, only a small number of studies on the effectiveness of assistive devices have been undertaken, and several of these report that devices are not being used. The overall aim of this research was to identify whether the disuse and ineffectiveness of assistive devices are related to changes in users' abilities or to the design of the devices. The objective was to gather information from older people, to identify which assistive devices are and are not being used, and to gain an understanding of their attitudes towards assistive devices. The research was conducted in two phases. In Phase One, questionnaires were distributed to people over the age of fifty, asking general questions as well as specific questions about the types of devices being used. Phase One was followed by Phase Two, in which participants from Phase One who had come into contact with assistive devices were invited to take part in a semi-structured interview about their use of, and attitudes towards, assistive devices. Findings indicated that the reasons for the disuse of assistive devices were mostly design related: bulkiness, reliability, the performance of the device, and difficulty of use. The other main reason for disuse was social: older people preferred to undertake activities on their own, to use a device only as a precaution or when absolutely necessary, and would prefer not to have to rely on the devices. Living situation and gender did not affect the preference for assistive devices over personal assistance. The majority strongly supported the idea of remaining independent for as long as possible. In conclusion, this study proposes that these findings will give product designers a better understanding of the requirements of elderly users, enabling them to produce assistive devices that are more practical, personalised, reliable, and easy to use, and that tie in with older people's environments. Additional research with different variables is recommended to further support these findings.
Abstract:
As a consequence of the increased incidence of collaborative arrangements between firms, the competitive environment characterising many industries has undergone profound change. It is suggested that rivalry is not necessarily enacted by individual firms through the traditional mechanisms of direct confrontation in factor and product markets, but rather through collaborative orchestration among a number of participants or network members. Strategic networks are recognised as sets of firms within an industry that exhibit denser strategic linkages among themselves than with other firms in the same industry. On this basis, strategic networks are identified from evidence of strategic alliances between the firms comprising the industry, so that a single strategic network represents a group of firms closely linked by collaborative ties. Arguably, the collective outcome of these strategic relationships engineered between firms suggests that the collaborative benefits attributed to interorganisational relationships require closer examination with respect to their propensity to influence rivalry in intra-industry environments. Derived largely from the social sciences, network theory allows for micro- and macro-level examination of the opportunities and constraints inherent in the structure of relationships in strategic networks, establishing a relational approach through which the conduct and performance of firms can be more fully understood. Research to date has yet to empirically investigate the relationship between strategic networks and rivalry. The limited research that has used a network rationale to investigate competitive patterns in contemporary industry environments has been characterised by a failure to directly measure rivalry. Further, this prior research has typically been embedded in industry settings dominated by technological or regulatory imperatives, such as the microprocessor and airline industries. These industries, because of such imperatives, are arguably more inclined to support the realisation of network rivalry, whether through subscription to prescribed technological standards (e.g., the microprocessor industry) or through regulatory constraints dictating operation within particular market segments (the airline industry). To counter these weaknesses, the proposition guiding the research (Are patterns of rivalry predicted by strategic network membership?) is examined in the United States Light Vehicles Industry, an industry not dominated by technological or regulatory imperatives. Further, rivalry is directly measured, distinguishing this investigation from prior research efforts. The timeframe of investigation is 1993 to 1999, with all research data derived from secondary sources. Strategic networks were defined within the United States Light Vehicles Industry based on evidence of horizontal strategic relationships between the firms comprising the industry. The measure of rivalry used to directly ascertain the competitive patterns of industry participants was derived from the traditional Herfindahl Index, modified to account for patterns of rivalry observed at the market segment level. Statistical analyses of the strategic network and rivalry constructs found little evidence to support the contention of network rivalry; indeed, greater levels of rivalry were observed between firms within the same strategic network than between firms in opposing network structures.
Based on these results, patterns of rivalry evidenced in the United States Light Vehicles Industry over the period 1993 to 1999 were not found to be predicted by strategic network membership. These findings are in contrast to current theorising on strategic networks and rivalry and are, in this respect, surprising. The relevance of industry type, in conjunction with prevailing network methodology, provides the basis on which these findings are interpreted. Overall, this study raises important questions about the relevance of the network rivalry rationale, establishing a fruitful avenue for further research.
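The rivalry measure itself is not reproduced in the abstract. For reference, the traditional Herfindahl Index from which it was derived is computed from market shares as

$$H = \sum_{i=1}^{N} s_i^2,$$

where $s_i$ is the market share of firm $i$ and $N$ is the number of firms in the industry. One plausible segment-level adaptation (offered here purely as an illustrative assumption, not the author's stated formulation) computes the index within each market segment and weights by segment size, e.g. $H_{seg} = \sum_j w_j \sum_i s_{ij}^2$, with $s_{ij}$ firm $i$'s share of segment $j$ and $w_j$ segment $j$'s share of total industry sales.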
Abstract:
The purpose of this proof-of-concept study was to determine the relevance of direct measurements for monitoring the load applied to the osseointegrated fixation of transfemoral amputees during static load-bearing exercises. The objectives were (A) to introduce an apparatus using a three-dimensional load transducer, (B) to present a range of derived information relevant to clinicians, (C) to report on the outcomes of a pilot study and (D) to compare the measurements from the transducer with those from the current method using a weighing scale. One transfemoral amputee fitted with an osseointegrated implant was asked to apply 10 kg, 20 kg, 40 kg and 80 kg to the fixation, using self-monitoring with the weighing scale. The loading was directly measured with a portable kinetic system comprising a six-channel transducer, external interface circuitry and a laptop. As the prescribed load increased from 10 kg to 80 kg, the forces and moments applied on and around the antero-posterior axis increased 4-fold anteriorly and 14-fold medially, respectively. The forces and moments applied on and around the medio-lateral axis increased 9-fold laterally and 16-fold from anterior to posterior, respectively. The long axis of the fixation was overloaded in 17% of the trials and underloaded in 83%, by up to ±10%. This proof-of-concept study presents an apparatus that can be used by clinicians facing the challenge of improving basic knowledge of osseointegration, designing equipment for load-bearing exercises and developing rehabilitation programs.
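To put the ±10% figure in context (a worked illustration assuming standard gravity; these newton values are not reported in the study), the target force along the long axis at the 80 kg prescribed load is

$$F \approx 80\,\mathrm{kg} \times 9.81\,\mathrm{m/s^2} \approx 785\,\mathrm{N},$$

so a ±10% deviation corresponds to measured axial forces between roughly 706 N and 863 N.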
Abstract:
Network-induced delay in networked control systems (NCS) is inherently non-uniformly distributed and exhibits a multifractal nature. However, such network characteristics have not been well considered in NCS analysis and synthesis. Making use of information about the statistical distribution of the network-induced delay, a delay-distribution-based stochastic model is adopted to link Quality-of-Control and network Quality-of-Service for NCS with uncertainties. From this model, together with a tighter bounding technique for cross terms, H∞ analysis of the NCS is carried out with significantly improved stability results. Furthermore, a memoryless H∞ controller is designed to stabilize the NCS and to achieve the prescribed disturbance attenuation level. Numerical examples are given to demonstrate the effectiveness of the proposed method.
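The abstract does not state the underlying system model. A generic sketch of the kind of formulation commonly used for this class of problem (an illustrative assumption, not necessarily the paper's exact model) is a linear plant with a time-varying network-induced input delay $\tau(t)$ under memoryless state feedback $u(t) = Kx(t)$:

$$\dot{x}(t) = A x(t) + B K x(t - \tau(t)) + B_w w(t), \qquad z(t) = C x(t),$$

where $w$ is the disturbance and $z$ the controlled output. The gain $K$ is chosen so that the closed loop is asymptotically stable and meets the prescribed H∞ disturbance attenuation level $\gamma$, i.e. $\|z\|_2 \le \gamma \|w\|_2$ for all nonzero $w \in L_2$ under zero initial conditions. A delay-distribution-based analysis would additionally weight the resulting stability conditions by the probabilities that $\tau(t)$ falls in particular intervals.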
Abstract:
This paper reports on a study investigating the preferred driving speeds and frequency of speeding of 320 Queensland drivers. Despite growing community concern about speeding and extensive research linking it to road trauma, speeding remains a pervasive and, arguably, socially acceptable behaviour. This presents an apparent paradox between beliefs and behaviours, and highlights the need to better understand the factors contributing to speeding. Utilising self-reported behaviour and attitudinal measures, the results of this study support the notion of a speed paradox. Two thirds of participants agreed that exceeding the limit is not worth the risks and that it is not acceptable to exceed the posted limit. Despite this, more than half (58.4%) of the participants reported a preference for exceeding the 100 km/h speed limit, with one third preferring to do so by 10 to 20 km/h. Further, mean preferred driving speeds on both urban and open roads suggest a perceived enforcement tolerance of 10%, indicating that posted limits have limited direct influence on speed choice. Factors that significantly predicted the frequency of speeding included exposure to role models who speed, favourable attitudes to speeding, experiences of punishment avoidance, and the perceived certainty of punishment for speeding. These findings have important policy implications, particularly relating to the use of enforcement tolerances.
Abstract:
Purpose: Students with low vision may be disadvantaged compared with their normally sighted peers, as they frequently work at very short working distances and need to use low vision devices. The aim of this study was to examine the sustained reading rates of students with low vision and compare them with those of their peers with normal vision. The effects of visual acuity, acuity reserve and age on reading rate were also examined. Method: Fifty-six students (10 to 16 years of age), 26 with low vision and 30 with normal vision, were required to read text continuously for 30 minutes. Their position in the text was recorded at two-minute intervals. Distance and near visual acuity, working distance, cause of low vision, reading rates and reading habits were recorded. Results: A total of 80.7 per cent of the students with low vision maintained a constant reading rate during the 30 minutes of reading, although they read at approximately half the rate (104 wpm) of their normally sighted peers (195 wpm). Only four of the low vision subjects could not complete the reading task. Reading rates increased significantly with acuity reserve and with distance and near visual acuity, but there was no significant relationship between age and sustained reading rate. Conclusions: The majority of students with low vision were able to maintain reading rates appropriate for coping in integrated educational settings. Surprisingly, only relatively few subjects (16 per cent) used their prescribed low vision devices, even though the average accommodative demand was 9 D, and they generally reported a greater dislike of reading than students with normal vision.
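For context on the 9 D figure (a worked illustration, not a calculation reported in the paper): for a distance-corrected eye, accommodative demand in dioptres is the reciprocal of the working distance in metres, so a demand of 9 D corresponds to a working distance of about $1/9\,\mathrm{m} \approx 11\,\mathrm{cm}$, the kind of very short working distance that prescribed low vision devices are intended to relieve.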
Abstract:
Objective: To investigate the impact of a train-the-trainer program on the nutritional status of older people in residential care. ----- Design: Prospective, randomized controlled study. Setting: Eight nursing homes in Southeast Queensland, Australia. ----- Participants: A total of 352 residents participated, of whom 245 (69.6%) were female. The mean age was 84.2 years and the majority (79.4%) were classified as high dependency. ----- Intervention: Residents from four nursing homes were randomly selected for a nutrition education program coordinated by Nutrition Coordinators. Residents from the other four nursing homes (control) received usual care. ----- Measurements: The Subjective Global Assessment was used to determine the prevalence of malnutrition at baseline and six months post-intervention. The Resident Classification Scale measured functional dependency. Prescribed diet, fluids, oral hygiene status and allied health referrals were obtained by chart audit. ----- Results: Approximately half the residents were well nourished, with 49.4% moderately or severely malnourished. Residents in the intervention group were more likely to maintain or improve their nutritional status than the control group, who were more likely to experience a deterioration (P=0.027). The odds of being malnourished at post-test were 1.6 times higher in the control group than in the intervention group, but this did not reach statistical significance (P=0.1). ----- Conclusion: The results of the study encourage the implementation of a Nutrition Coordinator program to maintain the nutritional status of aged care residents. Nevertheless, malnutrition rates continue to be unacceptably high. In a rapidly ageing society, the aged care sector needs to confront malnutrition and provide better resources for staff to take measures against this problem.
Abstract:
Presbyopia affects individuals from the age of 45 years onwards, resulting in difficulty in accurately focusing on near objects. Many optical corrections are available, including spectacles and contact lenses designed to enable presbyopes to see clearly at both far and near distances. However, presbyopic vision corrections also disturb aspects of visual function under certain circumstances, and the impact of these changes on activities of daily living, such as driving, is poorly understood. Therefore, the aim of this study was to determine which aspects of driving performance might be affected by wearing different types of presbyopic vision corrections. To achieve this aim, three experiments were undertaken. The first experiment involved administration of a questionnaire to compare the subjective driving difficulties experienced when wearing a range of common presbyopic contact lens and spectacle corrections. The questionnaire was developed and piloted, and included a series of items regarding difficulties experienced while driving under daytime and night-time conditions. Two hundred and fifty-five presbyopic patients responded to the questionnaire and were categorised into five groups: those wearing no vision correction for driving (n = 50), bifocal spectacles (BIF, n = 54), progressive addition lens spectacles (PAL, n = 50), monovision (MV, n = 53) and multifocal contact lenses (MTF CL, n = 48). Overall, ratings of satisfaction during daytime driving were relatively high for all correction types. However, MV and MTF CL wearers were significantly less satisfied with aspects of their vision during night-time than daytime driving, particularly with regard to disturbances from glare and haloes. Progressive addition lens wearers noticed more distortion of peripheral vision, BIF wearers reported more difficulties with tasks requiring changes in focus, and those who wore no vision correction for driving reported problems with intermediate and near tasks. Overall, the mean level of satisfaction for daytime driving was quite high for all of the groups (over 80%), with BIF wearers being the least satisfied with their vision for driving. Conversely, at night, MTF CL wearers expressed the least satisfaction. Research into eye and head movements has become of increasing interest in driving research, as it provides a means of understanding how the driver responds to visual stimuli in traffic. Previous studies have found that wearing PAL can affect eye and head movement performance, resulting in slower eye movement velocities and longer times to stabilise gaze for fixation. These changes in eye and head movement patterns may have implications for driving safety, given that the visual tasks involved in driving include a range of dynamic search tasks. Therefore, the second study was designed to investigate the influence of different presbyopic corrections on driving-related eye and head movements under standardised laboratory-based conditions. Twenty presbyopes (mean age: 56.1 ± 5.7 years) who had no experience of wearing presbyopic vision corrections, apart from single vision reading spectacles, were recruited. Each participant wore five different types of vision correction: single vision distance lenses (SV), PAL, BIF, MV and MTF CL.
For each visual condition, participants were required to view videotape recordings of traffic scenes, track a reference vehicle and identify a series of peripherally presented targets while their eye and head movements were recorded using the faceLAB® eye and head tracking system. Digital numerical display panels were also included as near visual stimuli (simulating the visual displays of a vehicle speedometer and radio). The results demonstrated that the path length of eye movements while viewing and responding to driving-related traffic scenes was significantly longer when wearing BIF and PAL than MV and MTF CL. The path length of head movements was greater with SV, BIF and PAL than with MV and MTF CL. Target recognition was less accurate when the near stimulus was located at eccentricities inferiorly and to the left, rather than directly below the primary position of gaze, regardless of vision correction type. The third experiment aimed to investigate the real-world driving performance of presbyopes wearing different vision corrections, measured on a closed-road circuit at night. Eye movements were recorded using the ASL Mobile Eye eye-tracking system (as the faceLAB® system proved impractical for use outside the laboratory). Eleven participants (mean age: 57.25 ± 5.78 years) were fitted with four types of prescribed vision corrections (SV, PAL, MV and MTF CL). The measures of driving performance on the closed-road circuit included distance to sign recognition, near target recognition, peripheral light-emitting diode (LED) recognition, recognition and avoidance of low-contrast road hazards, recognition of all road signs, time to complete the course, and driving behaviours such as braking, accelerating and cornering. The results demonstrated that driving performance at night was most affected by MTF CL compared with PAL, resulting in shorter distances at which signs could be read, slower driving speeds, and longer times spent fixating road signs. Monovision resulted in worse performance than SV and PAL on the distance at which signs could be read. The SV condition resulted in significantly more errors in interpreting information from in-vehicle devices, despite participants spending longer fixating on these devices. Progressive addition lenses were ranked as the most preferred vision correction, while MTF CL were the least preferred for night-time driving. This thesis addressed the research question of how presbyopic vision corrections affect driving performance, and the results of the three experiments demonstrated that the different types of presbyopic vision corrections (e.g. BIF, PAL, MV and MTF CL) can affect driving performance in different ways. Distance-related driving tasks showed reduced performance with MV and MTF CL, while tasks that involved viewing in-vehicle devices were significantly hampered by wearing SV corrections. Wearing spectacles such as SV, BIF and PAL induced greater eye and head movements in the simulated driving condition; however, this did not directly translate to impaired performance on the closed-road circuit tasks. These findings are important for understanding the influence of presbyopic vision corrections on vision under real-world driving conditions.
They will also assist eye care practitioners to understand, and to convey to patients, the potential driving difficulties associated with wearing certain types of presbyopic vision corrections, and thereby support the process of matching patients to optical corrections that meet their visual needs.
Abstract:
The increase in life expectancy worldwide during the last three decades has increased age-related disability, leading to the risk of a loss of quality of life. How to improve quality of life, including physical and mental health, for older people and optimise their life potential has become an important health issue. This study used the Theory of Planned Behaviour model to examine factors influencing health behaviours and their relationship with quality of life. A cross-sectional mailed survey of 1300 Australians over 50 years of age was conducted at the beginning of 2009, with 730 completed questionnaires returned (response rate 63%). Preliminary analysis reveals that the physiological changes of old age, especially increasing waist circumference and comorbidity, were closely related to health status, particularly a worse physical health summary score. Physical activity showed the lowest adherence among the respondents, compared with eating healthy food and taking medication regularly as prescribed. The increasing number of older people living alone with comorbid disease may be a barrier influencing their attitudes towards, and perceived control over, physical activity. A multidisciplinary and integrated approach, including hospital and non-hospital care, is required to provide appropriate services and facilities for older people.
Abstract:
For most of the 20th century a ‘closed’ system of adoption was practised throughout Australia and other modern Western societies. This ‘closed’ system was characterised by sealed records, birth certificates amended to conceal the adoption, and prohibited contact with all biological family. Despite claims that these measures protected children from the taint of illegitimacy, the central motivations were far more complex, involving a desire to protect couples from the stigma of infertility and to provide a socially acceptable family structure (Triseliotis, Feast, & Kyle, 2005; Marshall & McDonald, 2001). From the 1960s, significant evidence began to emerge that many adopted children and adults were experiencing higher incidences of psychological difficulties, characterised by problems with psychological adjustment, building self-esteem and forming a secure personal identity. These difficulties became grouped under the term ‘genealogical bewilderment’. As a result, new policies and practices were introduced to try to place the best interests of the child at the forefront. These changes reflected new understandings of adoption not only as an individual process but also as a social and relational process that continues throughout life. Secrecy and the withholding of birth information are now prohibited in the overwhelming majority of domestic adoptions processed in Australia (Marshall & McDonald, 2001). One little-known consequence of this ‘closed’ system of adoption was the significant number of children who were never told of their adoptive status. As a consequence, some have discovered, or had this information disclosed to them, as adults. The first study to examine experiences of late discovery of genetic origins was conducted by the Post Adoption Resource Centre in New South Wales in 1999. This report found that participants expressed feelings of disbelief, confusion, anger, sorrow and loss. Further, the majority of participants continued to struggle with issues arising from this intentional concealment of their genetic origins (Perl & Markham, 1999). A second, more recent study (Passmore, Feeney & Foulstone, 2007) looked at the issue of secrecy in adoptive families as part of a broader study of 144 adult adoptees. This study found that secrecy, lies or misinformation on the part of adoptive parents had negative effects on both personal identity and relationships with others. The authors noted that adoptees who found out about their adoption as adults were ‘especially likely to feel a sense of betrayal’ (p. 4). Over recent years, stories of secrecy and late discovery have also started to emerge from sperm donor-conceived adults (Spencer, 2007; Turner & Coyle, 2000). Current research evidence shows that although a majority of couples undergoing donor-assisted conception indicate that they intend to tell the offspring about their origins, as many as two-thirds or more of couples continue to withhold this information from their children (Akker, 2006; Gottlieb, A. McWhinnie, 2001; Salter-Ling, Hunter, & Glover, 2001). Why do they keep this secret? Infertility involves a range of complex factors that are often left unresolved or poorly understood by those choosing donor insemination as a form of family building (Schaffer & Diamond, 1993). These factors may only have an impact after the child is born, when resemblance talk becomes most pronounced.
Resemblance talk is an accepted form of public discourse and a social convention that legitimises the child as part of the family, and it forms part of the process of constructing the child's identity within the family. Couples tend to become focused on resemblance because this is where they feel most vulnerable, and a lack of resemblance to the parenting father may trigger his sense of loss (Becker, Butler, & Nachtigall, 2005).
Abstract:
The ageing population highlights the need to provide effective optical solutions for presbyopic contact lens wearers. However, data gathered from annual contact lens fitting surveys demonstrate that fewer than 40% of contact lens wearers over 45 years of age (virtually all of whom can be presumed to suffer a partial or complete loss of accommodation) are prescribed a presbyopic correction. Furthermore, monovision is prescribed as frequently as multifocal lenses. These observations suggest that an optimal solution to the contact lens correction of presbyopia remains elusive.
Abstract:
The current epidemic of paediatric obesity is accompanied by a myriad of health-related comorbid conditions. Despite the higher prevalence of orthopaedic conditions in overweight children, little published research has considered the influence of these conditions on the ability to undertake physical activity. As physical activity participation is directly related to improvements in physical fitness, skeletal health and metabolic conditions, higher levels of physical activity are encouraged, and exercise is commonly prescribed in the treatment and management of childhood obesity. However, research has not related orthopaedic conditions, including the increased joint pain and discomfort commonly reported by overweight children, to decreases in physical activity. Research has confirmed that overweight children typically display a slower, more tentative walking pattern, with increased forces at the hip, knee and ankle during 'normal' gait. This research, combined with anthropometric data indicating a higher prevalence of musculoskeletal malalignment in overweight children, suggests that such individuals are poorly equipped to undertake certain forms of physical activity. Concomitant increases in obesity and decreases in physical activity strongly support the need to better understand the musculoskeletal factors associated with the performance of motor tasks by overweight and obese children.
Abstract:
Frontline employee behaviours are recognised as vital for achieving a competitive advantage for service organisations. The services marketing literature has comprehensively examined ways to improve frontline employee behaviours in service delivery and recovery. However, limited attention has been paid to frontline employee behaviours that favour customers in ways that go against organisational norms or rules. This study examines these behaviours by introducing the behavioural concept of Customer-Oriented Deviance (COD). COD is defined as “frontline employees exhibiting extra-role behaviours that they perceive to defy existing expectations or prescribed rules of higher authority through service adaptation, communication and use of resources to benefit customers during interpersonal service encounters.” This thesis develops a COD measure and examines the key determinants of these behaviours from a frontline employee perspective. Existing research on similar behaviours, originating in the positive deviance and pro-social behaviour domains, has limitations and is considered inadequate for examining COD in the services context. The absence of a well-developed body of knowledge on non-conforming service behaviours has implications for both theory and practice. The provision of ‘special favours’ increases customer satisfaction, but over-servicing customers is also counterproductive for service delivery and costly for the organisation. Despite these implications of non-conforming service behaviours, there is little understanding of the nature of these behaviours and their key drivers. This research addresses inadequacies in the prior positive deviance, pro-social and pro-customer literature to develop the theoretical foundation of COD. The concept of positive deviance, which has predominantly been used to study organisational behaviours, is applied within a services marketing setting. Further, the research addresses previous limitations of the pro-social and pro-customer behavioural literature, which has examined only limited forms of these behaviours with no clear understanding of their nature. Building upon these literature streams, this research adopts a holistic approach to the conceptualisation of COD. It addresses previous shortcomings in the literature by providing a well-bounded definition, a psychometrically sound measure of COD and a conceptually well-founded model of COD. The concept of COD was examined across three separate studies, based on the theoretical foundations of role theory and social identity theory. Study 1 was exploratory and based on in-depth interviews using the Critical Incident Technique (CIT). Its aim was to understand the nature of COD and to qualitatively identify its key drivers. Thematic analysis of the data revealed two potential dimensions of COD behaviours: Deviant Service Adaptation (DSA) and Deviant Service Communication (DSC). In addition, themes representing the potential influences on COD were broadly classified as individual factors, situational factors and organisational factors. Study 2 was a scale development procedure that involved the generation and purification of items for the measure based on two student samples working in customer service roles (pilot sample, N=278; initial validation sample, N=231). The reliability and Exploratory Factor Analysis (EFA) results for the pilot sample suggested the scale had poor psychometric properties.
As a result, major revisions were made to item wordings, and new items were developed from the literature to reflect a new dimension, Deviant Use of Resources (DUR). The revised items were tested on the initial validation sample, with the EFA suggesting a four-factor structure of COD. The aim of Study 3 was to further purify the COD measure and to test for nomological validity based on its theoretical relationships with key antecedents and similar constructs (key correlates). The theoretical model of COD, consisting of nine hypotheses, was tested on retail and hospitality samples of frontline employees (retail N=311; hospitality N=305) drawn from a market research panel using an online survey. The data were analysed using Structural Equation Modelling (SEM). The results provided support for a re-specified second-order, three-factor model of COD consisting of 11 items. Overall, the COD measure was found to be reliable and valid, demonstrating convergent validity, discriminant validity and marginal partial invariance for the factor loadings. The results supported nomological validity, although the antecedents had differing impacts on COD across the samples. Specifically, empathy and perspective-taking, role conflict, and job autonomy significantly influenced COD in the retail sample, whereas empathy and perspective-taking, risk-taking propensity and role conflict were significant predictors in the hospitality sample. In addition, customer orientation-selling orientation, the altruistic dimension of organisational citizenship behaviours, workplace deviance, and social desirability responding were found to correlate with COD. This research makes several contributions to theory. First, the findings of this thesis extend the literature on positive deviance, pro-social and pro-customer behaviours. Second, the research provides an empirically tested model which describes the antecedents of COD. Third, it contributes a reliable and valid measure of COD. Finally, the research investigates the differential effects of the key antecedents on COD in different service sectors. The research findings also contribute to services marketing practice. Based on these findings, service practitioners can better understand the phenomenon of COD and use the measurement tool to calibrate COD levels within their organisations. Knowledge of the key determinants of COD will help improve recruitment and training programs and drive internal initiatives within the firm.
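The abstract does not reproduce the purification procedure itself (nor state the software used). As an illustration of one routine step in this kind of scale development, a minimal sketch of the standard Cronbach's alpha reliability computation, using a hypothetical respondents-by-items score matrix rather than the thesis data, might look like this:

import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scale scores."""
    k = item_scores.shape[1]                              # number of items in the scale
    item_variances = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 231 respondents answering 11 seven-point Likert-type items.
# Random, uncorrelated responses give a low alpha; real scale data with correlated
# items would yield higher values.
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(231, 11)).astype(float)
print(round(cronbach_alpha(responses), 3))

Scales are commonly retained when alpha exceeds roughly 0.7, although the actual thresholds applied in the thesis are not stated in the abstract.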
Abstract:
Aim: This paper reports a study exploring the phenomenon of resilience in the lives of adult patients of mental health services who have experienced mental illness. ---------- Background: Mental illness is a major health concern worldwide, and the majority of those experiencing it will continue to battle relapses throughout their lives. However, in many instances people go on to overcome their illness and lead productive and socially engaged lives. Contemporary mental health nursing practice focuses primarily on symptom reduction, and working with resilience has not generally been a consideration. ---------- Method: A descriptive phenomenological study was carried out in 2006. One participant was recruited through advertisements in community newspapers and newsletters, and the others through snowball sampling. Information was gathered through in-depth individual interviews, which were tape-recorded and subsequently transcribed. Colaizzi's original seven-step approach was used for data analysis, with the inclusion of two additional steps. ---------- Findings: The following themes were identified: Universality, Acceptance, Naming and knowing, Faith, Hope, Being the fool and Striking a balance, Having meaning and meaningful relationships, and 'Just doing it'. The conceptualization identified as encapsulating the themes was 'Viewing life from the ridge with eyes wide open', which involved knowing the risks and dangers ahead and making a decision for life amid ever-present hardships. ---------- Conclusion: Knowledge about resilience should be included in the theoretical and practical education of nursing students and experienced nurses. Early intervention, based on resilience factors identified through screening processes, is needed for people with mental illness.
Abstract:
Building on the Charity Commission's (2009) investigation of the effects of the economic downturn on the largest trusts and foundations in the United Kingdom, the purpose of this research was to assess the extent to which Australian trusts and foundations were taking an actively strategic approach to their investments and to their pursuit of mission (including grant-making), and the relationship between the two in the context of the economic downturn. Focus was given to identifying the issues raised as a consequence of the economic downturn, rather than providing a generalised snapshot of the ‘average’ foundation's response. In September 2009, semi-structured, in-depth interviews were conducted with executives of 23 grant-making trusts and foundations. The interviews focused on the largest grant makers in terms of grant expenditure, but the sample included foundations from different geographical locations and from across different cause areas. It is important to stress at the outset that this was not a representative sample of foundations; the study aimed to identify issues rather than to present a representative picture of the ‘average’ foundation's response. It is also important to note that the study was undertaken in September 2009, at a time when many foundations were beginning to feel more optimistic about the longer-term future but were aware of continuing, and possibly worsening, short-term income problems. Whatever the financial future, some of the underlying issues raised in this report, concerning investment and grant-making management practices, will be of continuing relevance and worthy of wider discussion. If a crisis is too good to waste, it is also too good to forget. One other introductory point: as previously noted, interviews for this study were conducted in September 2009, just one month prior to the introduction of the new Private Ancillary Fund (PAF) legislation, which replaced the previous Prescribed Private Fund (PPF) arrangement. References to PAFs and/or PPFs reflect that time.