Abstract:
Research found that today’s organisations are increasingly aware of the potential barriers and perceived challenges associated with the successful delivery of change, including cultural and sub-cultural differences; financial constraints; restricted timelines; insufficient senior management support; fragmented key stakeholder commitment; and inadequate training. The delivery and application of Innovative Change (see glossary) within a construction industry organisation tends to require a certain level of ‘readiness’. This readiness combines an organisation’s ability to part from undertakings that are old, traditional, or inefficient with its ability to readily adopt a procedure or initiative that is new, improved, or more efficient. Despite the construction industry’s awareness of the various threats and opportunities associated with the delivery of change, research found that little attention is currently given to developing a ‘decision-making framework’ of measurable elements (dynamics) that may assist in more accurately determining an organisation’s level of readiness, or ability, to deliver innovative change. To resolve this, an initial Background Literature Review in 2004 identified six such dynamics (Change, Innovation, Implementation, Culture, Leadership, and Training and Education), which were then hypothesised to be key components of a ‘Conceptual Decision-making Framework’ (CDF) for delivering innovative change within an organisation. To support this hypothesis, a second, more extensive Literature Review was undertaken from late 2007 to mid 2009, and a Delphi study began in June 2008, with fifteen building and construction industry members invited to form a panel.
The selection criteria required panel members to hold senior positions (manager and above) within a recognised field or occupation, and to have experience, understanding and/or knowledge of the process of delivering change within organisations. The final panel comprised nine representatives from private and public industry organisations and tertiary/research and development (R&D) universities. The Delphi study developed, distributed and collated two rounds of survey questionnaires over a four-month period, comprising open-ended and closed questions (referred to as factors). The first round of Delphi survey questionnaires was distributed to the panel in August 2008, asking members to rate the relevance of the six hypothesised dynamics. In early September 2008, round-one responses were returned, analysed and documented. From this, an additional three dynamics were identified and confirmed by the panel as being highly relevant to the decision-making process when delivering innovative change within an organisation. The additional dynamics (‘Knowledge-sharing and Management’; ‘Business Process Requirements’; and ‘Life-cycle Costs’) were then added to the first six dynamics and used to populate the second (final) Delphi survey questionnaire. This was distributed to the same nine panel members in October 2008, this time asking them to rate the relevance of all nine dynamics. In November 2008, round-two responses were returned, analysed, summarised and documented. Final results confirmed stability in responses and met Delphi study guidelines. The final contribution is twofold. Firstly, findings confirm all nine dynamics as key components of the proposed CDF for delivering innovative change within an organisation.
Secondly, the future development and testing of an ‘Innovative Change Delivery Process’ (ICDP) is proposed, one that is underpinned by an ‘Innovative Change Decision-making Framework’ (ICDF), an ‘Innovative Change Delivery Analysis’ (ICDA) program, and an ‘Innovative Change Delivery Guide’ (ICDG).
Abstract:
Gaining an improved understanding of people diagnosed with schizophrenia has the potential to influence priorities for therapy. Psychosis is commonly understood through the perspective of the medical model. However, the experience of social context surrounding psychosis is not well understood. In this research project we used a phenomenological methodology with a longitudinal design to interview 7 participants across a 12-month period to understand the social experiences surrounding psychosis. Eleven themes were explicated and divided into two phases of the illness experience: (a) transition into emotional shutdown included the experiences of not being acknowledged, relational confusion, not being expressive, detachment, reliving the past, and having no sense of direction; and (b) recovery from emotional shutdown included the experiences of being acknowledged, expression, resolution, independence, and a sense of direction. The experiential themes provide clinicians with new insights to better assess vulnerability, and have the potential to inform goals for therapy.
Abstract:
The aim of the research program was to evaluate the heat strain, hydration status, and heat illness symptoms experienced by surface mine workers. An initial investigation involved 91 surface miners completing a heat stress questionnaire assessing the work environment, hydration practices, and heat illness symptom experience. The key findings were that 1) more than 80% of workers experienced at least one symptom of heat illness over a 12-month period; and 2) the risk of moderate symptoms of heat illness increased with the severity of dehydration. These findings highlight a health and safety concern for surface miners, as experiencing symptoms of heat illness is an indication that the physiological systems of the body may be struggling to meet the demands of thermoregulation. To illuminate these findings, a field investigation to monitor the heat strain and hydration status of surface miners was proposed. Two preliminary studies were conducted to ensure accurate and reliable data collection techniques. Firstly, a study was undertaken to determine a calibration procedure to ensure the accuracy of core body temperature measurement via an ingestible sensor. A water bath was heated to several temperatures between 23 and 51 °C, allowing comparison of the temperature recorded by the sensors with that of a traceable thermometer. A positive systematic bias was observed, indicating a need for calibration. It was concluded that a linear regression should be developed for each sensor prior to ingestion, allowing a correction to be applied to the raw data. Secondly, hydration status was to be assessed through urine specific gravity measurement. It was foreseeable that practical limitations on mine sites would delay the time between urine collection and analysis. A study was therefore undertaken to assess the reliability of urine analysis over time.
Measurement of urine specific gravity was found to be reliable up to 24 hours post collection and was suitable for use in the field study. Twenty-nine surface miners (14 drillers [winter] and 15 blast crew [summer]) were monitored during a normal work shift. Core body temperature was recorded continuously. Average mean core body temperature was 37.5 and 37.4 °C for blast crew and drillers respectively, with average maximum body temperatures of 38.0 and 37.9 °C. The highest body temperature recorded was 38.4 °C. Urine samples were collected at each void for specific gravity measurement. The average mean urine specific gravity was 1.024 and 1.021 for blast crew and drillers respectively. The Heat Illness Symptoms Index was used to evaluate the experience of heat illness symptoms on shift. Over 70% of drillers and over 80% of blast crew reported at least one symptom. It was concluded that 1) heat strain remained within the recommended limits for acclimatised workers; and 2) the majority of workers were dehydrated before commencing their shift, and tended to remain dehydrated for its duration. Dehydration was identified as the primary issue for surface miners working in the heat. Therefore, continued study focused on investigating a novel approach to monitoring hydration status. The final aim of this research program was to investigate the influence dehydration has on intraocular pressure (IOP) and, subsequently, whether IOP could provide a novel indicator of hydration status. Seven males completed 90 minutes of walking in both a cool and a hot climate with fluid restriction. Hydration variables and intraocular pressure were measured at baseline and at 30-minute intervals. Participants became dehydrated during the trial in the heat but maintained hydration status in the cool. Intraocular pressure progressively declined in the trial in the heat but remained relatively stable when hydration was maintained.
A significant relationship was observed between intraocular pressure and both body mass loss and plasma osmolality. This evidence suggests that intraocular pressure is influenced by changes in hydration status. Further research is required to determine if intraocular pressure could be utilised as an indirect indicator of hydration status.
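The per-sensor calibration step described in this abstract (fit a linear regression of the traceable thermometer's temperature on each sensor's raw reading, then correct field data with the fitted line) can be sketched as follows. This is an illustrative reconstruction, not the study's code; the water-bath readings, and the positive bias they show, are invented.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return a, b

def calibrate_sensor(raw_readings, reference_temps):
    """Return a function mapping a raw sensor reading to a corrected temperature."""
    a, b = fit_linear(raw_readings, reference_temps)
    return lambda raw: a + b * raw

# Hypothetical water-bath points: raw ingestible-sensor readings vs a
# traceable thermometer, showing a small positive systematic bias.
raw = [23.2, 29.4, 35.3, 41.2, 51.3]
ref = [23.0, 29.0, 35.0, 41.0, 51.0]

correct = calibrate_sensor(raw, ref)
print(round(correct(38.0), 2))  # corrected temperature for a raw reading of 38.0
```

Each sensor would get its own fitted pair (a, b) before ingestion, and the correction would be applied only to that sensor's field data.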
Abstract:
While supportive-expressive group therapy (SEGT) has been found to be effective in significantly reducing distress associated with life-threatening illness, the challenge in Australia is to develop a means of providing supportive interventions to rural women who may be isolated both by the experience of illness and by geographical location. In this study an adaptation of SEGT was provided to women with metastatic breast cancer (n = 21), who attended face-to-face or by telephone conference call. Participants showed significant gains on standardised measures of well-being, including a reduction in negative affect and an increase in positive affect, over a 12-month period. A reduction in intrusive and avoidant stress symptoms was also observed over 12 months; however, this difference was not significant. These outcomes suggest that SEGT, delivered in an innovative way within a community setting, may be an effective means of moderating the adverse effects of a diagnosis of metastatic breast cancer while improving access to supportive care for rural women. These results are considered exploratory, as the study did not include a matched control group.
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, malnutrition is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in Asian elderly populations and none has been validated in Singapore. Given the ethnic, cultural, and language differences among Singapore older adults, the results from non-Asian validation studies may not be applicable. It is therefore important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, “Do you often feel sad or depressed?”), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing a decline in nutritional status (defined by weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. Results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As older adults may have developed malnutrition prior to hospital admission, or may experience clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community, and continuous nutritional monitoring should extend beyond hospitalisation.
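The sensitivity and specificity quoted for the TTSH NST against the SGA are standard 2x2-table quantities. A minimal sketch is below; the counts are invented (chosen only so the two percentages match the reported 84% and 79%) and are not the study's data.

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity and specificity of a screening tool vs a reference standard.

    tp: screen-positive and malnourished by the reference (SGA)
    fp: screen-positive but well-nourished
    fn: screen-negative but malnourished
    tn: screen-negative and well-nourished
    """
    sensitivity = tp / (tp + fn)  # proportion of truly malnourished who screen positive
    specificity = tn / (tn + fp)  # proportion of well-nourished who screen negative
    return sensitivity, specificity

# Hypothetical counts for illustration only.
sens, spec = diagnostic_performance(tp=84, fp=21, fn=16, tn=79)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```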
Abstract:
The Upper Roper River is one of Australia’s unique tropical rivers, having been largely untouched by development. The Upper Roper River catchment comprises the sub-catchments of the Waterhouse River and Roper Creek, the two tributaries of the Roper River. There is a complex geological setting with different aquifer types. In this seasonal system, close interaction between surface water and groundwater contributes to both streamflow and sustaining ecosystems. The interaction is highly variable between seasons. A conceptual hydrogeological model was developed to investigate the different hydrological processes and geochemical parameters, and to determine the baseline characteristics of the water resources of this pristine catchment. In the catchment, long-term average rainfall is around 850 mm and is summer dominant, which significantly influences the total hydrological system. The difference between seasons is pronounced, with high rainfall of up to 600 mm/month in the wet season and negligible rainfall in the dry season. Canopy interception significantly reduces the amount of effective rainfall because of the native vegetation cover in the pristine catchment. Evaporation exceeds rainfall for the majority of the year. Due to elevated evaporation and high temperature in the tropics, at least 600 mm of annual rainfall is required to generate potential recharge. Analysis of the trend in 120 years of rainfall data helped define “wet” and “dry” periods: a decreasing trend corresponds to dry periods, and an increasing trend to wet periods. The period from 1900 to 1970 was considered Dry period 1, when there were years with no effective rainfall, and where there was, the intensity of rainfall was around 300 mm. The period 1970-1985 was identified as Wet period 1, when positive effective rainfall occurred in almost every year and the intensity reached up to 700 mm. The period 1985-1995 was Dry period 2, with similar characteristics to Dry period 1.
Finally, the last decade was the Wet period 2, with effective rainfall intensity up to 800 mm. This variability in rainfall over decades increased/decreased recharge and discharge, improving/reducing surface water and groundwater quantity and quality in different wet and dry periods. The stream discharge follows the rainfall pattern. In the wet season, the aquifer is replenished, groundwater levels and groundwater discharge are high, and surface runoff is the dominant component of streamflow. Waterhouse River contributes two thirds and Roper Creek one third to Roper River flow. As the dry season progresses, surface runoff depletes, and groundwater becomes the main component of stream flow. Flow in Waterhouse River is negligible, the Roper Creek dries up, but the Roper River maintains its flow throughout the year. This is due to the groundwater and spring discharge from the highly permeable Tindall Limestone and tufa aquifers. Rainfall seasonality and lithology of both the catchment and aquifers are shown to influence water chemistry. In the wet season, dilution of water bodies by rainwater is the main process. In the dry season, when groundwater provides baseflow to the streams, their chemical composition reflects lithology of the aquifers, in particular the karstic areas. Water chemistry distinguishes four types of aquifer materials described as alluvium, sandstone, limestone and tufa. Surface water in the headwaters of the Waterhouse River, the Roper Creek and their tributaries are freshwater, and reflect the alluvium and sandstone aquifers. At and downstream of the confluence of the Roper River, river water chemistry indicates the influence of rainfall dilution in the wet season, and the signature of the Tindall Limestone and tufa aquifers in the dry. Rainbow Spring on the Waterhouse River and Bitter Spring on the Little Roper River (known as Roper Creek at the headwaters) discharge from the Tindall Limestone. 
Botanic Walk Spring and Fig Tree Spring discharge into the Roper River from tufa. The source of water was defined based on water chemical composition of the springs, surface and groundwater. The mechanisms controlling surface water chemistry were examined to define the dominance of precipitation, evaporation or rock weathering on the water chemical composition. Simple water balance models for the catchment have been developed. The important aspects to be considered in water resource planning of this total system are the naturally high salinity in the region, especially the downstream sections, and how unpredictable climate variation may impact on the natural seasonal variability of water volumes and surface-subsurface interaction.
Abstract:
Objectives The p38 mitogen-activated protein kinase (MAPK) signal transduction pathway is involved in a variety of inflammatory responses, including cytokine generation, cell differentiation, proliferation and apoptosis. Here, we examined the effects of systemic p38 MAPK inhibition on cartilage cells and osteoarthritis (OA) disease progression by both in vitro and in vivo approaches. Methods p38 kinase activity was evaluated in normal and OA cartilage cells by measuring the amount of phosphorylated protein. To examine the function of the p38 signaling pathway in vitro, normal chondrocytes were isolated, differentiated in the presence or absence of the p38 inhibitor SB203580, and analysed for chondrogenic phenotype. The effect of systemic p38 MAPK inhibition in normal and OA (induced by meniscectomy) rats was analysed by treating animals with vehicle alone (DMSO) or the p38 inhibitor (SB203580). Damage to the femur and tibial plateau was evaluated by modified Mankin score, histology and immunohistochemistry. Results Our in vitro studies revealed that down-regulation of chondrogenic and an increase of hypertrophic gene expression occur in normal chondrocytes when p38 is neutralized by a pharmacological inhibitor. We further observed that the basal levels of p38 phosphorylation were decreased in OA chondrocytes compared with normal chondrocytes. These findings together indicate the importance of this pathway in the regulation of cartilage physiology and its relevance to OA pathogenesis. In vivo, systemic administration of a specific p38 MAPK inhibitor, SB203580, continuously for over a month led to a significant loss of proteoglycan, aggrecan and cartilage thickness. Moreover, SB203580-treated normal rats showed a significant increase in TUNEL-positive cells and in cartilage hypertrophy markers such as type X collagen, Runt-related transcription factor and matrix metalloproteinase-13, and substantially induced OA-like phenotypic changes in the normal rats.
In addition, meniscectomy-induced OA rat models treated with the p38 inhibitor showed aggravation of cartilage damage. Conclusions In summary, this study has provided evidence that the p38 MAPK pathway is important for maintaining cartilage health and that its inhibition can lead to severe cartilage degenerative changes. The observations in this study highlight the possibility of using activators of the p38 pathway as an alternative approach in the treatment of OA.
Abstract:
Background Anemia due to iron deficiency is recognized as one of the major nutritional deficiencies in women and children in developing countries. Daily iron supplementation for pregnant women is recommended in many countries, although there are few reports of these programs working efficiently or effectively. Weekly iron-folic acid supplementation (WIFS) and regular deworming treatment are recommended for non-pregnant women living in areas with high rates of anemia. Following a baseline survey to assess the prevalence of anemia, iron deficiency and soil-transmitted helminth infections, we implemented a program to make WIFS and regular deworming treatment freely and universally available to all women of reproductive age in two districts of a province in northern Vietnam over a 12-month period. The impact of the program at the population level was assessed in terms of: i) change in mean hemoglobin and iron status indicators, and ii) change in the prevalence of anemia, iron deficiency and hookworm infections. Method Distribution of WIFS and deworming were integrated with routine health services and made available to 52,000 women. Demographic data and blood and stool samples were collected in baseline, three-month and 12-month post-implementation surveys using a population-based, stratified multi-stage cluster sampling design. Results The mean Hb increased by 9.6 g/L (95% CI 5.7, 13.5, p < 0.001) during the study period. Anemia (Hb <120 g/L) was present in 131/349 (37.5%, 95% CI 31.3, 44.8) subjects at baseline, and in 70/363 (19.3%, 95% CI 14.0, 24.6) after twelve months. Iron deficiency reduced from 75/329 (22.8%, 95% CI 16.9, 28.6) to 33/353 (9.3%, 95% CI 5.7, 13.0) by the 12-month survey, and hookworm infection from 279/366 (76.2%, 95% CI 68.6, 83.8) to 66/287 (23.0%, 95% CI 17.5, 28.5) over the same period. Conclusion A free, universal WIFS program with regular deworming was associated with reduced prevalence and severity of anemia, iron deficiency and hookworm infection.
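The prevalence figures in this abstract pair a proportion with a 95% confidence interval. A simple normal-approximation version is sketched below; note the survey used a stratified multi-stage cluster design, so its published intervals include a design effect and are wider than this unadjusted formula produces.

```python
import math

def prevalence_ci(cases, n, z=1.96):
    """Proportion with a normal-approximation 95% CI (no design-effect adjustment)."""
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)  # standard error under simple random sampling
    return p, p - z * se, p + z * se

# Baseline anemia figure from the abstract: 131 of 349 subjects.
p, lo, hi = prevalence_ci(131, 349)
print(f"{p:.1%} (unadjusted 95% CI {lo:.1%} to {hi:.1%})")
```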
Abstract:
Background Birth weight and length show seasonal fluctuations. Previous analyses of birth weight by latitude identified seemingly contradictory results, showing both 6- and 12-monthly periodicities in weight. The aims of this paper are twofold: (a) to explore seasonal patterns in a large Danish Medical Birth Register, and (b) to explore models based on seasonal exposures and a non-linear exposure-risk relationship. Methods Birth weights and birth lengths of over 1.5 million Danish singleton, live births were examined for seasonality. We modelled seasonal patterns based on linear, U- and J-shaped exposure-risk relationships. We then added an extra layer of complexity by modelling weighted population-based exposure patterns. Results The Danish data showed clear seasonal fluctuations for both birth weight and birth length. A bimodal model best fits the data; however, the amplitude of the 6- and 12-month peaks changed over time. In the modelling exercises, U- and J-shaped exposure-risk relationships generate time series with both 6- and 12-month periodicities. Changing the weightings of the population exposure risks results in unexpected properties. A J-shaped exposure-risk relationship with a diminishing population exposure over time fitted the observed seasonal pattern in the Danish birth weight data. Conclusion In keeping with many other studies, Danish birth anthropometric data show complex and shifting seasonal patterns. We speculate that annual periodicities with non-linear exposure-risk models may underlie these findings. Understanding the nature of seasonal fluctuations can help generate candidate exposures.
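The modelling point in this abstract, that a non-linear (U- or J-shaped) exposure-risk relationship turns a purely annual exposure cycle into an outcome series with both 6- and 12-month periodicities, can be demonstrated with a toy simulation. The quadratic risk curve and its coefficients below are invented for illustration.

```python
import math

months = range(12)
# Exposure follows a pure 12-month (annual) cycle.
exposure = [math.cos(2 * math.pi * m / 12) for m in months]

def j_shaped_risk(x):
    # Quadratic exposure-risk curve: risk rises at both exposure extremes.
    return 1.0 + 0.2 * x + 0.5 * x ** 2

risk = [j_shaped_risk(x) for x in exposure]

def amplitude(series, k):
    """Magnitude of the k-cycles-per-year Fourier component of a 12-point series."""
    n = len(series)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(series))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(series))
    return 2 * math.sqrt(re ** 2 + im ** 2) / n

annual = amplitude(risk, 1)      # 12-month periodicity in the risk series
semiannual = amplitude(risk, 2)  # 6-month periodicity created by the non-linearity
print(round(annual, 3), round(semiannual, 3))
```

Because cos²θ = (1 + cos 2θ)/2, the quadratic term necessarily converts part of the annual exposure cycle into a semi-annual component of risk, which is the kind of mechanism the paper proposes for the observed bimodal pattern.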
Abstract:
Executive coaching is a rapidly expanding approach to leadership development which has grown at a rate that warrants extensive examination of its effects (Wasylyshyn, 2003). This thesis has therefore examined both the behavioural and psychological effects of a nine-month executive coaching intervention within a large not-for-profit organisation. The intervention was part of a larger, ongoing integrated organisational strategy to create an organisational coaching culture. To examine the effectiveness of the nine-month executive coaching intervention, two studies were conducted. A quantitative study used pre- and post-intervention questionnaires to examine leaders’ and their team members’ responses before and after the coaching intervention. The research examined leader-empowering behaviours, psychological empowerment, job satisfaction and affective commitment. Significant results were demonstrated in leaders’ self-reports of leader-empowering behaviours, and their team members’ self-reports revealed a significant flow-on effect of psychological empowerment. The second part of the investigation involved a qualitative study which explored the developmental nature of psychological empowerment through executive coaching. The examination dissected psychological empowerment into its widely accepted four facets of meaning, impact, competency and self-determination (Spreitzer, 1992) and investigated, through semi-structured interviews, leaders’ perspectives on the effect of executive coaching upon them. It was discovered that a number of the common practices within executive coaching, including goal-setting, accountability and action-reflection, contributed to outcomes that developed higher levels of psychological empowerment. Careful attention was also given to organisational context and its influence upon the outcomes.
Abstract:
The study of venture idea characteristics and the contextual fit between venture ideas and individuals are key research goals in entrepreneurship (Davidsson, 2004). However, to date there has been limited scholarly attention given to these phenomena. Accordingly, this study aims to help fill the gap by investigating the importance of novelty and relatedness of venture ideas in entrepreneurial firms. On the premise that new venture creation is a process and that research should be focused on the early stages of the venturing process, this study primarily focuses its attention on examining how venture idea novelty and relatedness affect the performance in the venture creation process. Different types and degrees of novelty are considered here. Relatedness is shown to be based on individuals’ prior knowledge and resource endowment. Performance in the venture creation process is evaluated according to four possible outcomes: making progress, getting operational, being terminated and achieving positive cash flow. A theoretical model is developed demonstrating the relationship between these variables along with the investment of time and money. Several hypotheses are developed to be tested. Among them, it is hypothesised that novelty hinders short term performance in the venture creation process. On the other hand knowledge and resource relatedness are hypothesised to promote performance. An experimental study was required in order to understand how different types and degrees of novelty and relatedness of venture ideas affect the attractiveness of venture ideas in the eyes of experienced entrepreneurs. Thus, the empirical work in this thesis was based on two separate studies. In the first one, a conjoint analysis experiment was conducted on 32 experienced entrepreneurs in order to ascertain attitudinal preferences regarding venture idea attractiveness based on novelty, relatedness and potential financial gains. 
This helped to estimate utility values for different levels of different attributes of venture ideas, and their relative importance in attractiveness. The second study was a longitudinal investigation of how venture idea novelty and relatedness affect performance in the venture creation process. The data for this study come from the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) project, which was established to explore the new venture creation process in Australia. CAUSEE collected data from a representative sample of over 30,000 households in Australia using random digit dialling (RDD) telephone interviews. From these cases, data were collected at two points in time during a 12-month period from 493 firms that were involved in the start-up process. Hypotheses were tested and inferences derived through descriptive statistics, confirmatory factor analysis and structural equation modelling. Results of study 1 indicate that venture idea characteristics play a role in attractiveness, and that entrepreneurs prefer to introduce a moderate degree of novelty across all types of venture ideas considered. Knowledge relatedness is demonstrated to be a more significant factor in attractiveness than resource relatedness. Results of study 2 show that novelty hinders nascent venture performance. On the other hand, resource relatedness has a positive impact on performance, unlike knowledge relatedness, which has none. The results of these studies have important implications for potential entrepreneurs, investors, researchers and consultants, developing a better understanding of the venture creation process and its success factors in terms of both theory and practice.
Abstract:
PURPOSE: To examine the relationship between contact lens (CL) case contamination and various potential predictive factors. METHODS: 74 subjects were fitted with lotrafilcon B (CIBA Vision) CLs on a daily wear basis for 1 month. Subjects were randomly assigned one of two polyhexamethylene biguanide (PHMB) preserved disinfecting solutions with the corresponding regular lens case. Clinical evaluations were conducted at lens delivery and after 1 month, when cases were collected for microbial culture. A CL care non-compliance score was determined through administration of a questionnaire, and the volume of solution used was calculated for each subject. Data were examined using backward stepwise binary logistic regression. RESULTS: 68% of cases were contaminated. 35% were moderately or heavily contaminated and 36% contained gram-negative bacteria. Case contamination was significantly associated with subjective dryness symptoms (OR 4.22, CI 1.37–13.01) (P<0.05). There was no association between contamination and subject age, ethnicity, gender, average wearing time, amount of solution used, non-compliance score, CL power and subjective redness (P>0.05). The effect of lens care system on case contamination approached significance (P=0.07). Failure to rinse the case with disinfecting solution following CL insertion (OR 2.51, CI 0.52–12.09) and not air drying the case (OR 2.31, CI 0.39–13.35) were positively associated with contamination; however, these associations did not reach statistical significance. CONCLUSIONS: Our results suggest that case contamination may influence subjective comfort. It is difficult to predict the development of case contamination from a variety of clinical factors. The efficacy of CL solutions, bacterial resistance to disinfection and biofilm formation are likely to play a role. Further evaluation of these factors will improve our understanding of the development of case contamination and its clinical impact.
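The odds ratios and confidence intervals reported above follow the standard construction from a logistic-regression coefficient and its standard error. As a sketch (the standard error below is an assumption back-calculated to be consistent with the reported CI of 1.37–13.01, not a value from the study):

```python
# Sketch: converting a logistic-regression coefficient and standard error
# into an odds ratio with a 95% Wald confidence interval.
import math

def odds_ratio_ci(beta, se, z=1.96):
    """beta: fitted log-odds coefficient; se: its standard error.
    Returns (odds ratio, lower 95% bound, upper 95% bound)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative numbers: beta = ln(4.22) recovers the reported OR for
# dryness symptoms; se = 0.574 is an assumed value consistent with
# the reported 95% CI of 1.37-13.01.
beta = math.log(4.22)
or_, lo, hi = odds_ratio_ci(beta, se=0.574)
```

Wide intervals such as 0.52–12.09 (spanning 1.0) are why the rinsing and air-drying effects, despite OR > 2, were not statistically significant.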
Abstract:
A new measure of work-related self-efficacy for people with psychiatric disabilities is reported. The 37-item scale measures self-efficacy in four relevant activity domains: 1) vocational service access and career planning, 2) job acquisition, 3) work-related social skills, and 4) general work skills. The scale was developed in a 12-month longitudinal survey of urban residents diagnosed with schizophrenia or schizoaffective disorder (n = 104). Results indicate validity of both a four-factor structure differentiating four core skill domains, and a single factor representing total work-related self-efficacy. The favorable psychometric properties support further research and trial applications in supported employment and psychiatric vocational rehabilitation.
Abstract:
Objective: To develop a self-report scale of subjective experiences of illness perceived to impact on employment functioning, as an alternative to a diagnostic perspective, for anticipating the vocational assistance needs of people with schizophrenia or schizoaffective disorders. Method: A repeated measures pilot study (n1 = 26, n2 = 21) of community residents with schizophrenia identified a set of work-related subjective experiences perceived to impact on employment functioning. Items with the best psychometric properties were applied in a 12-month longitudinal survey of urban residents with schizophrenia or schizoaffective disorder (n1 = 104; n2 = 94; n3 = 94). Results: Construct validity, factor structure, responsiveness, internal consistency, stability, and criterion validity investigations produced favourable results. Work-related subjective experiences provide information about the intersection of the person, the disorder, and expectations of employment functioning, which suggests new opportunities for vocational professionals to explore and discuss individual assistance needs. Conclusion: Further psychometric investigations of test-retest reliability, discriminant and predictive validity, and research applications in supported employment and vocational rehabilitation, are recommended. Subject to adequate psychometric properties, the new measure promises to facilitate exploring: individuals' specific subjective experiences; how each is perceived to contribute to employment restrictions; and the corresponding implications for specialized treatment, vocational interventions and workplace accommodations.
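The internal consistency investigations mentioned in the two scale-development abstracts above are commonly summarised with Cronbach's alpha. As a minimal sketch of that statistic (the item scores below are hypothetical, not data from either study):

```python
# Sketch: Cronbach's alpha for internal consistency of a multi-item scale.
# alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
def cronbach_alpha(items):
    """items: list of item-score lists, one inner list per item,
    each inner list holding one score per respondent."""
    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    item_var = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Hypothetical responses: 3 items rated by 4 respondents
items = [[3, 4, 2, 5], [2, 4, 3, 5], [3, 5, 2, 4]]
alpha = cronbach_alpha(items)
```

Values around 0.7 or higher are conventionally taken as acceptable internal consistency, though the threshold depends on the scale's purpose.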
Abstract:
Adults diagnosed with primary brain tumours often experience physical, cognitive and neuropsychiatric impairments and a decline in quality of life. Although disease- and treatment-related information is commonly provided to cancer patients and carers, newly diagnosed brain tumour patients and their carers report unmet information needs. Few interventions have been designed or proven to address these information needs. Accordingly, a three-study research program, incorporating both qualitative and quantitative research methods, was designed to: 1) identify and select an intervention to improve the provision of information and meet the needs of patients with a brain tumour; 2) use an evidence-based approach to establish the content, language and format of the intervention; and 3) assess the acceptability of the intervention, and the feasibility of its evaluation, with newly diagnosed brain tumour patients. Study 1: Structured concept mapping techniques were undertaken with 30 health professionals, who identified strategies (items) for improving care and rated each of the 42 items for importance, feasibility, and the extent to which such care was currently provided. Participants also provided data to interpret the relationships between items, which were translated into 'maps' of the relationships between information and other aspects of health care using multidimensional scaling and hierarchical cluster analysis. Results were discussed by participants in small groups and individual interviews to understand the ratings, and the facilitators of and barriers to implementation. A care coordinator was rated as the most important strategy by health professionals. Two items directly related to information provision were also seen as highly important: "information to enable the patient or carer to ask questions" and "for doctors to encourage patients to ask questions". 
Qualitative analyses revealed that information provision was individualised: it depended on patients' information needs and preferences, demographic variables and distress, the characteristics of the health professionals who provide information, and the relationship between the individual patient and health professional, and was influenced by the fragmented nature of the health care system. Based on the quantitative and qualitative findings, a brain tumour specific question prompt list (QPL) was chosen for development and feasibility testing. A QPL consists of a list of questions that patients and carers may want to ask their doctors. It is designed to encourage the asking of questions in the medical consultation, allowing patients to control the content and amount of information provided by health professionals. Study 2: The initial structure and content of the brain tumour specific QPL were based upon thematic analyses of: 1) patient materials for brain tumour patients; 2) QPLs designed for other patient populations; and 3) clinical practice guidelines for the psychosocial care of glioma patients. An iterative process of review and refinement of content was undertaken via telephone interviews with a convenience sample of 18 patients and/or carers. Successive drafts of the QPL were sent to patients and carers, and changes were made until no new topics or suggestions arose in four successive interviews (saturation). Once the QPL content was established, readability analyses and redrafting were conducted to achieve a sixth-grade reading level. The draft QPL was also reviewed by eight health professionals, and shortened and modified based on their feedback. The QPL was then professionally designed and sent to patients and carers for further review. The final QPL contained questions in seven colour-coded sections: 1) diagnosis; 2) prognosis; 3) symptoms and problems; 4) treatment; 5) support; 6) after treatment finishes; and 7) the health professional team. 
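Readability analyses of the kind used to bring the QPL to a sixth-grade level are often based on the Flesch-Kincaid grade formula, one common index (the thesis does not specify which tool was used, so this is an assumed example; the word, sentence and syllable counts below are illustrative):

```python
# Sketch: Flesch-Kincaid grade level from pre-counted text statistics.
# grade = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
def flesch_kincaid_grade(total_words, total_sentences, total_syllables):
    """Returns the approximate US school grade needed to read the text."""
    return (0.39 * (total_words / total_sentences)
            + 11.8 * (total_syllables / total_words)
            - 15.59)

# Illustrative counts for a short patient-information passage:
# short sentences and mostly one-syllable words keep the grade low.
grade = flesch_kincaid_grade(total_words=120,
                             total_sentences=12,
                             total_syllables=156)
```

A result at or below 6.0 would meet the sixth-grade target; longer sentences or more polysyllabic words push the grade up, which is why redrafting typically shortens both.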
Study 3: A feasibility study was conducted to determine the acceptability of the QPL and the appropriateness of methods, to inform a potential future randomised trial to evaluate its effectiveness. A pre-test post-test design was used with a nonrandomised control group. The control group was provided with ‘standard information’, the intervention group with ‘standard information’ plus the QPL. The primary outcome measure was acceptability of the QPL to participants. Twenty patients from four hospitals were recruited a median of 1 month (range 0-46 months) after diagnosis, and 17 completed baseline and follow-up interviews. Six participants would have preferred to receive the information booklet (standard information or QPL) at a different time, most commonly at diagnosis. Seven participants reported on the acceptability of the QPL: all said that the QPL was helpful, and that it contained questions that were useful to them; six said it made it easier to ask questions. Compared with control group participants’ ratings of ‘standard information’, QPL group participants’ views of the QPL were more positive; the QPL had been read more times, was less likely to be reported as ‘overwhelming’ to read, and was more likely to prompt participants to ask questions of their health professionals. The results from the three studies of this research program add to the body of literature on information provision for brain tumour patients. Together, these studies suggest that a QPL may be appropriate for the neuro-oncology setting and acceptable to patients. The QPL aims to assist patients to express their information needs, enabling health professionals to better provide the type and amount of information that patients need to prepare for treatment and the future. This may help health professionals meet the challenge of giving patients sufficient information, without providing ‘too much’ or ‘unnecessary’ information, or taking away hope. 
Future studies with rigorous designs are now needed to determine the effectiveness of the QPL.