92 results for Accuracy of Hotel Feasibility Study Projections


Relevance:

100.00%

Publisher:

Abstract:

This study aimed to assess the feasibility of a home-based exercise program and examine its effects on the healing rates of venous leg ulcers. A 12-week randomised controlled trial was conducted to investigate the effects of an exercise intervention compared with a usual care group. Participants in both groups (n = 13) had active venous ulceration and were treated in a metropolitan hospital outpatient clinic in Australia. Data were collected at recruitment from medical records, clinical assessment and questionnaires. Follow-up data on healing progress and treatments were collected fortnightly for 12 weeks. Calf muscle pump function data were collected at baseline and 12 weeks from recruitment. Range of ankle motion data were collected at baseline, 6 and 12 weeks from recruitment. This pilot study indicated that the intervention was feasible. Clinically meaningful differences were observed in the intervention group, with a 32% greater decrease in ulcer size (p = 0.34) and a 10% greater proportion of participants healed (p = 0.74) compared with the control group. Significant differences between groups over time were observed in calf muscle pump function parameters (ejection fraction [p = 0.05]; residual volume fraction [p = 0.04]) and range of ankle motion (p = 0.01). This pilot study is one of the first to examine and measure clinical healing rates for participants in a home-based progressive resistance exercise program. Further research is warranted with a larger multi-site study.

Relevance:

100.00%

Publisher:

Abstract:

In this submission, we provide evidence for our view that copyright policy in the UK must encourage new digital business models which meet the changing needs of consumers and foster innovation in the UK, both within and beyond the creative industries. We illustrate our arguments using evidence from the music industry. However, we believe that our key points on the relationship between the copyright system and innovative digital business models apply across the UK creative industries.

Relevance:

100.00%

Publisher:

Abstract:

Background: When experiencing sleep problems for the first time, consumers often approach community pharmacists for advice, as pharmacists are easily accessible health care professionals in the community. In Australian community pharmacies there are no specific tools available to assist pharmacists with the assessment and management of consumers with sleep enquiries. Objective: To assess the feasibility of improving the detection of sleep disorders within the community by piloting a newly developed Community Pharmacy Sleep Assessment Tool (COP-SAT). Method: The COP-SAT was designed to incorporate elements from a number of existing, standardized and validated clinical screening measures. The COP-SAT was trialed in four Australian community pharmacies over a 4-week period. Key findings: A total of 241 community pharmacy consumers were assessed using the COP-SAT. A total of 74 (30.7%) were assessed as being at risk of insomnia, 26 (10.7%) were at risk of daytime sleepiness, 19 (7.9%) were at risk of obstructive sleep apnea, and 121 (50.2%) were regular snorers. A total of 116 (48.1%) participants indicated that they consumed caffeine before bedtime, of whom 55 (47%) had associated symptoms of sleep onset insomnia. Moreover, 85 (35%) consumed alcohol before bedtime, of whom 50 (58%) experienced fragmented sleep, 50 (58%) were regular snorers, and nine (10.6%) had apnea symptoms. The COP-SAT was feasible in the community pharmacy setting. The prevalence of sleep disorders in the sampled population was high, but generally consistent with previous studies of the general population. Conclusion: A large proportion of participants reported sleep disorder symptoms, and a link was found between the consumption of alcohol and caffeine at bedtime and associated sleep symptoms. While larger studies are needed to assess the clinical properties of the tool, the results of this feasibility study demonstrate that the COP-SAT may be a practical tool for identifying patients at risk of developing sleep disorders in the community.

Relevance:

100.00%

Publisher:

Abstract:

A fundamental proposition is that the accuracy of the designer's tender price forecasts is positively correlated with the amount of information available for the project. The paper describes an empirical study of the effect of the quantity of available information on practicing quantity surveyors' forecasting accuracy. The methodology involved the surveyors repeatedly revising tender price forecasts on receipt of chunks of project information. Each of twelve surveyors undertook two projects and selected information chunks from a total of sixteen information types. The analysis indicated marked differences in accuracy between project types and between expert and non-expert surveyors. The expert surveyors' forecasts were not found to be significantly improved by information beyond basic building type and size, even after eliminating project type effects. The expert surveyors' forecasts based on knowledge of building type and size alone were, however, found to be of similar accuracy to that of average practitioners pricing full bills of quantities.
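
A minimal sketch of the kind of accuracy measure typically used in such studies, assuming forecasts are compared against the eventual lowest tender price at each information stage; the figures and stage structure below are illustrative, not the study's data.

```python
import statistics

# Hypothetical illustration (not the study's data): each forecast is revised as
# successive chunks of project information are received, and accuracy is expressed
# as the absolute percentage error relative to the eventual lowest tender price.

def percentage_error(forecast: float, actual_tender: float) -> float:
    """Signed percentage error of a tender price forecast."""
    return 100.0 * (forecast - actual_tender) / actual_tender

def accuracy_by_stage(revisions: list[list[float]], actuals: list[float]) -> list[float]:
    """Mean absolute percentage error at each information stage across projects."""
    n_stages = len(revisions[0])
    return [
        statistics.mean(
            abs(percentage_error(project[stage], actual))
            for project, actual in zip(revisions, actuals)
        )
        for stage in range(n_stages)
    ]

# Two illustrative projects, each forecast revised over four information stages.
revisions = [
    [1_150_000, 1_080_000, 1_060_000, 1_020_000],   # project A forecasts
    [2_400_000, 2_550_000, 2_500_000, 2_480_000],   # project B forecasts
]
actual_tenders = [1_000_000, 2_450_000]

# Error typically falls as information accumulates.
print(accuracy_by_stage(revisions, actual_tenders))
```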

Relevance:

100.00%

Publisher:

Abstract:

Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the reverse direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument between empiricists and deductionists is that, as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible so as to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.

Relevance:

100.00%

Publisher:

Abstract:

This study explores the accuracy and valuation implications of applying a comprehensive list of equity multiples in the takeover context. Motivating the study are the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. In exploring the application of equity multiples in this context, three research questions are addressed: (1) how accurate are equity multiples (RQ1); (2) which equity multiples are more accurate in valuing the firm (RQ2); and (3) which equity multiples are associated with greater misvaluation of the firm (RQ3). Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples, (2) multiples closer to bottom-line earnings, (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited for multiples based on Trading Revenue, NCFO, Book Value (BV) and earnings before interest, tax, depreciation and amortisation (EBITDA) versus multiples based on bottom-line earnings, and that multiples based on Intrinsic Value will display the least overvaluation. The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show that, first, the majority of computed multiples examined exhibit valuation errors within 30 percent of stock market values. Second, and consistent with expectations, the results support the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the greatest overvaluation for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited for multiples based on Trading Revenue for targets, and on NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation. Given the widespread use of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, the findings add to the growing body of valuation accuracy literature, especially within Australia, and should help market participants to better understand the relative accuracy and misvaluation consequences of various equity multiples used in takeover documentation and assist them in subsequent investment decision making.
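
A minimal sketch of how a multiples-based valuation and its error against market value can be computed, assuming a forward earnings multiple as the value driver; the comparables and target figures are hypothetical, not drawn from the study's sample.

```python
# Minimal sketch (illustrative numbers, not the study's data): value a target firm with
# the median forward P/E of comparable firms and express accuracy as the valuation
# error relative to the observed market capitalisation.
import statistics

def multiple_valuation(comparable_multiples: list[float], value_driver: float) -> float:
    """Estimate equity value as (median comparable multiple) x (firm's value driver)."""
    return statistics.median(comparable_multiples) * value_driver

def valuation_error(estimate: float, market_cap: float) -> float:
    """Signed valuation error as a fraction of market capitalisation."""
    return (estimate - market_cap) / market_cap

# Hypothetical comparables' forward P/E ratios and the target's forecast earnings.
peer_forward_pe = [14.2, 15.8, 16.5, 13.9, 17.1]
target_forecast_earnings = 120e6     # AUD
target_market_cap = 2.0e9            # AUD

estimate = multiple_valuation(peer_forward_pe, target_forecast_earnings)
print(f"Estimated equity value: {estimate:,.0f}")
print(f"Valuation error: {valuation_error(estimate, target_market_cap):+.1%}")
```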

Relevance:

100.00%

Publisher:

Abstract:

The study examined the accuracy of maternal perception of child weight. Urban, affluent mothers of 111 children aged 2-5 years were recruited. Nearly a quarter of mothers misperceived their underweight child as being of healthy weight, and all overweight/obese children were perceived as being of healthy weight. Mothers therefore were unable to recognize their child's true weight status.

Relevance:

100.00%

Publisher:

Abstract:

For clinical use of electrocardiogram (ECG) signal analysis, it is important to detect not only the centre of the P wave, the QRS complex and the T wave, but also the time intervals, such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms to detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability, we propose the use of Markov-chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study to investigate the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, where details such as wave morphology and noise levels are variable.
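
A minimal sketch of the general idea, assuming a random-walk Metropolis sampler and a single Gaussian-shaped wave as the feature model; the synthetic signal, priors and noise level below are assumptions, not the paper's model.

```python
# Minimal sketch (not the paper's model): a random-walk Metropolis sampler that
# estimates the amplitude, centre and width of a single Gaussian-shaped wave in a
# noisy synthetic ECG segment.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)

def wave(params, t):
    amp, centre, width = params
    return amp * np.exp(-0.5 * ((t - centre) / width) ** 2)

true_params = np.array([1.0, 0.45, 0.02])          # synthetic "QRS"-like peak
signal = wave(true_params, t) + rng.normal(0.0, 0.05, t.size)

def log_posterior(params):
    amp, centre, width = params
    if not (0.0 < amp < 5.0 and 0.0 < centre < 1.0 and 0.001 < width < 0.2):
        return -np.inf                              # flat priors on plausible ranges
    resid = signal - wave(params, t)
    return -0.5 * np.sum((resid / 0.05) ** 2)       # Gaussian likelihood, known noise

current = np.array([0.5, 0.5, 0.05])                # crude starting guess
current_lp = log_posterior(current)
samples = []
for _ in range(20000):
    proposal = current + rng.normal(0.0, [0.02, 0.005, 0.002])
    proposal_lp = log_posterior(proposal)
    if np.log(rng.uniform()) < proposal_lp - current_lp:   # Metropolis acceptance
        current, current_lp = proposal, proposal_lp
    samples.append(current)

posterior = np.array(samples[5000:])                # discard burn-in
print("posterior means (amp, centre, width):", posterior.mean(axis=0))
```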

Relevance:

100.00%

Publisher:

Abstract:

A mine site water balance is important for communicating information to interested stakeholders, for reporting on water performance, and for anticipating and mitigating water-related risks through water use/demand forecasting. Improving the accuracy of the water balance is therefore crucial for sites to achieve best practice water management and to maintain their social license to operate. For sites located in high rainfall environments, the water received by storage dams through runoff can represent a large proportion of the overall inputs to site; inaccuracies in these flows can therefore lead to inaccuracies in the overall site water balance. Hydrological models that estimate runoff flows are often incorporated into simulation models used for water use/demand forecasting. The Australian Water Balance Model (AWBM) is one example that has been widely applied in the Australian context. However, the calibration of AWBM in a mining context can be challenging. Through a detailed case study, we outline an approach that was used to calibrate and validate AWBM at a mine site. Commencing with a dataset of monitored dam levels, a mass balance approach was used to generate an observed runoff sequence. By incorporating a portion of this observed dataset into the calibration routine, we achieved a closer fit between the observed and simulated datasets compared with the base case. We conclude by highlighting opportunities for future research to improve the calibration fit by improving the quality of the input dataset. This will ultimately lead to better models for runoff prediction and thereby improve the accuracy of mine site water balances.
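
A minimal sketch of the mass-balance step described above, under the simplifying assumption that storage change, direct rainfall, abstractions and other inflows are the only terms; the series and variable names are hypothetical.

```python
# Minimal sketch (hypothetical series, simplified terms): back-calculate an "observed"
# runoff sequence from monitored dam storage volumes using a mass balance, i.e.
# runoff = change in storage + abstractions - direct rainfall - other inflows.
# Evaporation, seepage and pumped transfers would be additional terms in practice.

def observed_runoff(storage_ml, rainfall_ml, abstractions_ml, other_inflows_ml):
    """Daily runoff into the dam (ML/day) inferred from storage changes."""
    runoff = []
    for day in range(1, len(storage_ml)):
        delta_storage = storage_ml[day] - storage_ml[day - 1]
        inferred = delta_storage + abstractions_ml[day] - rainfall_ml[day] - other_inflows_ml[day]
        runoff.append(max(inferred, 0.0))   # negative values indicate unmodelled losses
    return runoff

# Hypothetical five-day record (megalitres).
storage       = [1200.0, 1180.0, 1250.0, 1310.0, 1300.0]
direct_rain   = [0.0, 0.0, 10.0, 5.0, 0.0]
abstractions  = [25.0, 25.0, 25.0, 25.0, 25.0]
other_inflows = [5.0, 5.0, 5.0, 5.0, 5.0]

print(observed_runoff(storage, direct_rain, abstractions, other_inflows))
```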

Relevance:

100.00%

Publisher:

Abstract:

This study aims to assess the accuracy of a Digital Elevation Model (DEM) generated using Toutin's model. Toutin's model was run using OrthoEngine SE of PCI Geomatics 10.3. The along-track stereo images of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor, with 15 m resolution, were used to produce a DEM over an area of low, near-Mean Sea Level (MSL) elevation in Johor, Malaysia. Despite satisfactory pre-processing results, visual assessment of the DEM generated from Toutin's model showed that the DEM contained many outliers and incorrect values. The failure of Toutin's model may be due mostly to the inaccuracy and insufficiency of ASTER ephemeris data for low terrain, as well as the large water body in the stereo images.
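
A minimal sketch of a standard way to quantify vertical DEM accuracy against independent check points (mean error and RMSE); the check-point values are hypothetical, and this is not necessarily the exact assessment procedure used in the study.

```python
# Minimal sketch (illustrative values): quantify vertical DEM accuracy against
# independent check points using mean error (bias) and RMSE, the usual summary
# statistics in DEM accuracy assessment.
import math

def vertical_accuracy(dem_heights, checkpoint_heights):
    """Return (mean error, RMSE) of DEM elevations against check points, in metres."""
    errors = [d - c for d, c in zip(dem_heights, checkpoint_heights)]
    mean_error = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return mean_error, rmse

dem_at_checkpoints = [4.8, 6.1, 3.2, 9.5, 2.7]   # metres above MSL, from the DEM
gps_checkpoints    = [4.2, 5.9, 2.1, 8.8, 2.5]   # surveyed reference elevations

bias, rmse = vertical_accuracy(dem_at_checkpoints, gps_checkpoints)
print(f"bias = {bias:.2f} m, RMSE = {rmse:.2f} m")
```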

Relevance:

100.00%

Publisher:

Abstract:

An understanding of the loads generated within the prosthetic leg can aid engineers in the design of components and clinicians in the process of rehabilitation. Traditional methods of assessing these loads have relied on inverse dynamics. This indirect method estimates the applied load using video recordings and force plates located at a distance from the region of interest, such as the base of the residuum. The well-known limitations of this method relate to the accuracy of the recursive model and the experimental conditions required (Frossard et al., 2003). Recent developments in sensors (Frossard et al., 2003) and prosthetic fixation (Brånemark et al., 2000) permit the direct measurement of the loads applied on the residuum of transfemoral amputees. In principle, direct measurement should be an appropriate tool for assessing the accuracy of inverse dynamics. The purpose of this paper is to determine the validity of this assumption. The comparative variable used in this study is the velocity of the relative body center of mass (VCOM(t)). The relative formulation is used to align the force-plate measurement, which is static with respect to position, with the dynamic load cell measurement.
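
A minimal sketch of how the vertical centre-of-mass velocity can be derived from force-plate data via Newton's second law and numerical integration, assuming a known body mass and initial velocity; the trace below is synthetic, not the paper's data.

```python
# Minimal sketch (synthetic numbers, not the paper's data): estimate vertical
# centre-of-mass velocity from force-plate data by applying Newton's second law and
# integrating the acceleration: a(t) = F_vertical(t)/m - g, v(t) = v0 + integral of a dt.
import numpy as np

def com_vertical_velocity(vertical_force_n, body_mass_kg, dt_s, v0_m_s=0.0):
    """Cumulative trapezoid integration of COM vertical acceleration from force-plate data."""
    g = 9.81
    accel = np.asarray(vertical_force_n) / body_mass_kg - g
    velocity = v0_m_s + np.concatenate(([0.0], np.cumsum((accel[1:] + accel[:-1]) * 0.5 * dt_s)))
    return velocity

# Hypothetical 1 kHz force-plate trace for an 80 kg subject during early stance.
dt = 0.001
forces = 784.8 + 150.0 * np.sin(np.linspace(0, np.pi, 200))   # N; 784.8 N = body weight
v = com_vertical_velocity(forces, body_mass_kg=80.0, dt_s=dt)
print(f"peak upward COM velocity: {v.max():.3f} m/s")
```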

Relevance:

100.00%

Publisher:

Abstract:

Objective: The aim of this study was to determine the feasibility of a combined supervised and home-based exercise intervention during chemotherapy for women with recurrent ovarian cancer. Secondary aims were to determine the impact of physical activity on physical and psychological outcomes and on chemotherapy completion rates. Methods: Women with recurrent ovarian cancer were recruited from 3 oncology outpatient clinics in Sydney and Canberra, Australia. All participants received an individualized exercise program consisting of 90 minutes or more of low to moderate aerobic, resistance, core stability, and balance exercise per week, for 12 weeks. Feasibility was determined by recruitment rate, retention rate, intervention adherence, and adverse events. Aerobic capacity, muscular strength, fatigue, sleep quality, quality of life, depression, and chemotherapy completion rates were assessed at weeks 0, 12, and 24. Results: Thirty participants were recruited (recruitment rate, 63%), with a retention rate of 70%. Participants averaged 196 ± 138 min/wk of low to moderate physical activity throughout the intervention, with adherence to the program at 81%. There were no adverse events resulting from the exercise intervention. Participants who completed the study displayed significant improvements in quality of life (P = 0.017), fatigue (P = 0.004), mental health (P = 0.007), muscular strength (P = 0.001), and balance (P = 0.003) after the intervention. Participants completing the intervention had a higher relative dose intensity than noncompleters (P = 0.03). Conclusions: A program consisting of 90 min/wk of low to moderate exercise was achieved by two-thirds of the women with recurrent ovarian cancer in this study, with no adverse events reported. Randomized controlled studies are required to confirm the benefits of exercise reported in this study.

Relevance:

100.00%

Publisher:

Abstract:

A new technique called the reef resource inventory (RRI) was developed to map the distribution and abundance of benthos and substratum on reefs. The rapid field sampling technique uses divers to visually estimate the percentage cover of categories of benthos and substratum along 2 x 20 m plotless strip transects positioned randomly over the tops of reefs and systematically along their edges. The purpose of this study was to compare the relative sampling accuracy of the RRI against the line intercept transect technique (LIT), an international standard for sampling reef benthos and substratum. Analysis of paired sampling with LIT and RRI at 51 sites indicated that sampling accuracy was not different (P > 0.05) for 8 of the 12 benthos and substratum categories used in the study. Significant differences were attributed to small-scale patchiness and cryptic coloration of some benthos; effects associated with sampling a sparsely distributed animal along a line versus an area; difficulties in discriminating some of the benthos and substratum categories; and differences due to visual acuity, since LIT measurements were taken by divers close to the seabed whereas RRI measurements were taken by divers higher in the water column. The relative cost efficiency of the RRI technique was at least three times that of LIT for all benthos and substratum categories, and as much as 10 times higher for two categories. These results suggest that the RRI can be used to obtain reliable and accurate estimates of the relative abundance of broad categories of reef benthos and substratum.
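
A minimal sketch of the kind of paired, site-by-site comparison described above, assuming percent-cover estimates from both techniques at the same sites; the values are made up for illustration, not taken from the study.

```python
# Minimal sketch (made-up cover values, not the study's data): a paired comparison of
# percent-cover estimates from the two techniques for one benthos category, sampled
# at the same sites, to test whether the techniques give systematically different estimates.
from scipy import stats

# Percent cover of one category estimated at the same sites by each technique.
lit_cover = [12.5, 30.0, 8.0, 22.5, 15.0, 40.0, 5.5, 18.0]
rri_cover = [14.0, 28.5, 9.5, 21.0, 16.5, 38.0, 6.0, 20.0]

result = stats.ttest_rel(lit_cover, rri_cover)
print(f"paired t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
# A p-value above 0.05 would indicate no detectable difference between LIT and RRI
# estimates for this category, as reported for 8 of the 12 categories in the study.
```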

Relevance:

100.00%

Publisher:

Abstract:

Little is known about the neural mechanisms by which transcranial direct current stimulation (tDCS) impacts on language processing in post-stroke aphasia. This was addressed in a proof-of-principle study that explored the effects of tDCS application in aphasia during simultaneous functional magnetic resonance imaging (fMRI). We employed a single-subject, cross-over, sham-tDCS-controlled design, and the stimulation was administered to an individualized perilesional stimulation site identified by a baseline fMRI scan and a picture naming task. Peak activity during the baseline scan was located in the spared left inferior frontal gyrus, and this area was stimulated during a subsequent cross-over phase. tDCS was successfully administered to the target region, and anodal vs. sham tDCS resulted in selectively increased activity at the stimulation site. Our results thus demonstrate that it is feasible to precisely target an individualized stimulation site in aphasia patients during simultaneous fMRI, which allows assessment of the neural mechanisms underlying tDCS application. The functional imaging results of this case report highlight one possible mechanism that may have contributed to beneficial behavioral stimulation effects in previous clinical tDCS trials in aphasia. In the future, this approach will allow the identification of distinct patterns of stimulation effects on neural processing in larger cohorts of patients. This may ultimately yield information about the variability of tDCS effects on brain functions in aphasia.