961 results for Isotropic and Anisotropic models


Analytical and computational models of the intervertebral disc (IVD) are commonly employed to enhance understanding of the biomechanics of the human spine and spinal motion segments. The accuracy of these models in predicting physiological behaviour of the spine is intrinsically reliant on the accuracy of the material constitutive representations employed to represent the spinal tissues. There is a paucity of detailed mechanical data describing the material response of the reinforced ground matrix in the anulus fibrosus of the IVD. In the present study, the ‘reinforced ground matrix’ was defined as the matrix with the collagen fibres embedded but not actively bearing axial load, thus incorporating the contribution of the fibre-fibre and fibre-matrix interactions. To determine mechanical parameters for the anulus ground matrix, mechanical tests were carried out on specimens of ovine anulus under unconfined uniaxial compression, simple shear and biaxial compression. Test specimens of ovine anulus fibrosus were obtained with an adjacent layer of vertebral bone/cartilage on the superior and inferior specimen surfaces. Specimen geometry was such that there were no continuous collagen fibres coupling the two endplates. Samples were subdivided according to disc region - anterior, lateral and posterior - to determine the regional inhomogeneity in the anulus mechanical response. Specimens were loaded at a strain rate sufficient to avoid fluid outflow from the tissue, and typical stress-strain responses under the initial load application and under repeated loading were determined for each of the three loading types. The response of the anulus tissue to the initial and repeated load cycles was significantly different for all load types, except biaxial compression in the anterior anulus.
Since the maximum applied strain exceeded the damage strain for the tissue, experimental results for repeated loading reflected the mechanical ability of the tissue to carry load subsequent to the initiation of damage. To our knowledge, this is the first study to provide experimental data describing the response of the ‘reinforced ground matrix’ to biaxial compression. Additionally, it is novel in defining a study objective to determine the regionally inhomogeneous response of the ‘reinforced ground matrix’ under an extensive range of loading conditions suitable for mechanical characterisation of the tissue. The results presented facilitate the development of more detailed and comprehensive constitutive descriptions for the large strain nonlinear elastic or hyperelastic response of the anulus ground matrix.
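
The simplest incompressible hyperelastic description of the kind the abstract refers to is a Neo-Hookean solid. A minimal sketch of its uniaxial nominal stress-stretch response follows; the shear modulus value is illustrative only, not a parameter fitted to the ovine data:

```python
def neo_hookean_nominal_stress(stretch, mu):
    """Nominal (first Piola-Kirchhoff) stress of an incompressible
    Neo-Hookean solid under uniaxial loading: P = mu * (L - 1/L^2)."""
    return mu * (stretch - 1.0 / stretch**2)

# Illustrative shear modulus (kPa); not a value reported in the study.
mu = 100.0
for stretch in (0.85, 0.90, 0.95, 1.00):  # compressive stretches
    print(f"stretch {stretch:.2f}: nominal stress "
          f"{neo_hookean_nominal_stress(stretch, mu):+.1f} kPa")
```

Experimental curves such as those described above are typically used to fit mu (and the parameters of richer strain-energy functions) by least squares against the measured stress-strain data.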

The aim of this paper is to contribute to the understanding of various models used in research for the adoption and diffusion of information technology in small and medium-sized enterprises (SMEs). Starting with Rogers' diffusion theory and behavioural models, technology adoption models used in IS research are discussed. Empirical research has shown that the reasons why firms choose to adopt or not adopt technology are dependent on a number of factors. These factors can be categorised as owner/manager characteristics, firm characteristics and other characteristics. The existing models explaining IS diffusion and adoption by SMEs overlap and complement each other. This paper reviews the existing literature and proposes a comprehensive model which includes the whole array of variables from earlier models.

The Australian screen industries are a leading domestic creative industry sector at a crossroads. New production, distribution and exhibition technologies are challenging traditional models of ‘filmmaking’. For the screen industries to remain competitive they must renovate business models for an emerging marketplace. This paper is a preliminary examination of three key aspects of next generation filmmaking: post-cinema approaches to screen production, emerging production and business models, and issues for policy.

Fast thrust changes are important for authoritative control of VTOL micro air vehicles. Fixed-pitch rotors that alter thrust by varying rotor speed require high-bandwidth control systems to provide adequate performance. We develop a feedback compensator for a brushless hobby motor driving a custom rotor suitable for UAVs. The system plant is identified using step excitation experiments. The aerodynamic operating conditions of these rotors are unusual and so experiments are performed to characterise expected load disturbances. The plant and load models lead to a proportional controller design capable of significantly decreasing rise-time and propagation of disturbances, subject to bus voltage constraints.
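
The rise-time benefit of proportional feedback on a first-order rotor-speed plant can be sketched as follows. The plant gain, time constant and controller gain below are illustrative values, not the parameters identified in the experiments:

```python
def simulate(Kp, tau=0.05, K=2.0, r=1.0, T=0.05, dt=1e-4):
    """Euler simulation of the first-order plant tau*dy/dt = -y + K*u
    under proportional control u = Kp*(r - y).
    Returns the fraction of the closed-loop steady state reached at time T."""
    y = 0.0
    for _ in range(int(T / dt)):
        u = Kp * (r - y)                  # proportional control effort
        y += dt * (-y + K * u) / tau
    y_ss = K * Kp * r / (1.0 + K * Kp)    # closed-loop steady-state value
    return y / y_ss

# Higher gain shrinks the closed-loop time constant tau/(1 + K*Kp),
# so the same elapsed time covers a larger fraction of the step response.
print(simulate(Kp=0.5), simulate(Kp=5.0))
```

Note that a pure proportional loop leaves a steady-state error of r/(1 + K*Kp); the actuator saturation imposed by the bus voltage constraint mentioned above is also omitted from this sketch.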

Since the 1960s, the value relevance of accounting information has been an important topic in accounting research. The value relevance research provides evidence as to whether accounting numbers relate to corporate value in a predicted manner (Beaver, 2002). Such research is not only important for investors but also provides useful insights into accounting reporting effectiveness for standard setters and other users. Both the quality of accounting standards used and the effectiveness associated with implementing these standards are fundamental prerequisites for high value relevance (Hellstrom, 2006). However, while the literature comprehensively documents the value relevance of accounting information in developed markets, little attention has been given to emerging markets where the quality of accounting standards and their enforcement are questionable. Moreover, there is currently no known research that explores the association between level of compliance with International Financial Reporting Standards (IFRS) and the value relevance of accounting information. Motivated by the lack of research on the value relevance of accounting information in emerging markets and the unique institutional setting in Kuwait, this study has three objectives. First, it investigates the extent of compliance with IFRS with respect to firms listed on the Kuwait Stock Exchange (KSE). Second, it examines the value relevance of accounting information produced by KSE-listed firms over the 1995 to 2006 period. The third objective links the first two and explores the association between the level of compliance with IFRS and the value relevance of accounting information to market participants. Since it is among the first countries to adopt IFRS, Kuwait provides an ideal setting in which to explore these objectives. 
In addition, the Kuwaiti accounting environment provides an interesting regulatory context in which each KSE-listed firm is required to appoint at least two external auditors from separate auditing firms. Based on the research objectives, five research questions (RQs) are addressed. RQ1 and RQ2 aim to determine the extent to which KSE-listed firms comply with IFRS and factors contributing to variations in compliance levels. These factors include firm attributes (firm age, leverage, size, profitability, liquidity), the number of brand name (Big-4) auditing firms auditing a firm’s financial statements, and industry categorization. RQ3 and RQ4 address the value relevance of IFRS-based financial statements to investors. RQ5 addresses whether the level of compliance with IFRS contributes to the value relevance of accounting information provided to investors. Based on the potential improvement in value relevance from adopting and complying with IFRS, it is predicted that the higher the level of compliance with IFRS, the greater the value relevance of book values and earnings. The research design of the study consists of two parts. First, in accordance with prior disclosure research, the level of compliance with mandatory IFRS is examined using a disclosure index. Second, the value relevance of financial statement information, specifically, earnings and book value, is examined empirically using two valuation models: price and returns models. The combined empirical evidence that results from the application of both models provides comprehensive insights into value relevance of accounting information in an emerging market setting. Consistent with expectations, the results show the average level of compliance with IFRS mandatory disclosures for all KSE-listed firms in 2006 was 72.6 percent, thus indicating that KSE-listed firms generally did not fully comply with all requirements.
Significant variations in the extent of compliance are observed among firms and across accounting standards. As predicted, older, highly leveraged, larger, and profitable KSE-listed firms are more likely to comply with IFRS required disclosures. Interestingly, significant differences in the level of compliance are observed across the three possible auditor combinations of two Big-4, two non-Big-4, and mixed audit firm types. The results for the price and returns models provide evidence that earnings and book values are significant factors in the valuation of KSE-listed firms during the 1995 to 2006 period. However, the results show that the value relevance of earnings and book values decreased significantly during that period, suggesting that investors rely less on financial statements, possibly due to the increase in available non-financial sources of information. Notwithstanding this decline, a significant association is observed between the level of compliance with IFRS and the value relevance of earnings and book value to KSE investors. The findings make several important contributions. First, they raise concerns about the effectiveness of the regulatory body that oversees compliance with IFRS in Kuwait. Second, they challenge the effectiveness of the two-auditor requirement in promoting compliance with regulations as well as the associated cost-benefit of this requirement for firms. Third, they provide the first known empirical evidence linking the level of IFRS compliance with the value relevance of financial statement information. Finally, the findings are relevant for standard setters and for their current review of KSE regulations. In particular, they highlight the importance of establishing and maintaining adequate monitoring and enforcement mechanisms to ensure compliance with accounting standards.
In addition, the finding that stricter compliance with IFRS improves the value relevance of accounting information highlights the importance of full compliance with IFRS and not just mere adoption.
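
A price model of the kind used in this literature is an Ohlson-style regression of share price on book value per share and earnings per share. A minimal sketch with synthetic numbers (the data and coefficients are illustrative, not the study's estimates):

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det(A)
    xs = []
    for j in range(3):
        Aj = [row[:] for row in A]
        for i in range(3):
            Aj[i][j] = b[i]
        xs.append(det(Aj) / d)
    return xs

def ols_price_model(prices, book_values, earnings):
    """OLS for P = a + b1*BV + b2*E via the normal equations X'X beta = X'P."""
    X = [[1.0, bv, e] for bv, e in zip(book_values, earnings)]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    XtP = [sum(r[i] * p for r, p in zip(X, prices)) for i in range(3)]
    return solve3(XtX, XtP)

# Synthetic firm-year data generated from P = 2 + 1.5*BV + 8*E (no noise)
bv = [1.0, 2.0, 3.0, 4.0, 5.0]
eps = [0.5, 0.4, 0.9, 0.7, 1.1]
p = [2 + 1.5*b + 8*e for b, e in zip(bv, eps)]
a, b1, b2 = ols_price_model(p, bv, eps)
print(round(a, 6), round(b1, 6), round(b2, 6))
```

In value relevance studies, the explanatory power (R-squared) of this regression over time, rather than the coefficients alone, is typically the quantity of interest.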

Micropolar and RNG-based modelling of industrially relevant boundary layer and recirculating swirling flows is described. Both models contain a number of adjustable parameters and auxiliary conditions that must be either modelled or experimentally determined, and the effects of varying these on the resulting flow solutions are quantified. To these ends, the behaviour of the micropolar model for self-similar flow over a surface that is both stretching and transpiring is explored in depth. The simplified governing equations permit both analytic and numerical approaches to be adopted, and a number of closed form solutions (both exact and approximate) are obtained using perturbation and order of magnitude analyses. Results are compared with the corresponding Newtonian flow solution in order to highlight the differences between the micropolar and classical models, and significant new insights into the behaviour of the micropolar model are revealed for this flow. The behaviour of the RNG-based models for swirling flow with vortex breakdown zones is explored in depth via computational modelling of two experimental data sets and an idealised breakdown flow configuration. Meticulous modelling of upstream auxiliary conditions is required to correctly assess the behaviour of the models studied in this work. The novel concept of using the results to infer the role of turbulence in the onset and topology of the breakdown zone is employed.
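
For the Newtonian limit of the stretching-sheet problem (without transpiration), Crane's closed-form similarity solution f(η) = 1 − e^(−η) satisfies the boundary-layer equation f′′′ + f f′′ − (f′)² = 0 exactly, and is the kind of baseline against which the micropolar results are compared. A quick numerical check of that residual, as a sketch:

```python
import math

def crane_residual(eta):
    """Residual of f''' + f*f'' - (f')**2 for f(eta) = 1 - exp(-eta)."""
    f = 1.0 - math.exp(-eta)
    fp = math.exp(-eta)     # f'
    fpp = -math.exp(-eta)   # f''
    fppp = math.exp(-eta)   # f'''
    return fppp + f * fpp - fp**2

# The residual should vanish to machine precision across the boundary layer.
worst = max(abs(crane_residual(0.1 * k)) for k in range(101))
print(worst)
```

The micropolar equations couple an additional microrotation field to this momentum balance, which is why closed-form solutions there require the perturbation and order-of-magnitude analyses mentioned above.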

The Mount Isa Basin is a new concept used to describe the area of Palaeo- to Mesoproterozoic rocks south of the Murphy Inlier and inappropriately described presently as the Mount Isa Inlier. The new basin concept presented in this thesis allows for the characterisation of basin-wide structural deformation, correlation of mineralisation with particular lithostratigraphic and seismic stratigraphic packages, and the recognition of areas with petroleum exploration potential. The northern depositional margin of the Mount Isa Basin is the metamorphic, intrusive and volcanic complex here referred to as the Murphy Inlier (not the "Murphy Tectonic Ridge"). The eastern, southern and western boundaries of the basin are obscured by younger basins (Carpentaria, Eromanga and Georgina Basins). The Murphy Inlier rocks comprise the seismic basement to the Mount Isa Basin sequence. Evidence for the continuity of the Mount Isa Basin with the McArthur Basin to the northwest and the Willyama Block (Basin) at Broken Hill to the south is presented. These areas combined with several other areas of similar age are believed to have comprised the Carpentarian Superbasin (new term). The application of seismic exploration within Authority to Prospect (ATP) 423P at the northern margin of the basin was critical to the recognition and definition of the Mount Isa Basin. The Mount Isa Basin is structurally analogous to the Palaeozoic Arkoma Basin of Oklahoma and Arkansas in the southern USA but, as with all basins, it contains unique characteristics, a function of its individual development history. The Mount Isa Basin evolved in a manner similar to many well described, Phanerozoic plate tectonic driven basins. A full Wilson Cycle is recognised and a plate tectonic model proposed. The northern Mount Isa Basin is defined as the Proterozoic basin area northwest of the Mount Gordon Fault.
Deposition in the northern Mount Isa Basin began with a rift sequence of volcaniclastic sediments followed by a passive margin drift phase comprising mostly carbonate rocks. Following the rift and drift phases, major north-south compression produced east-west thrusting in the south of the basin inverting the older sequences. This compression produced an asymmetric epi- or intra-cratonic clastic dominated peripheral foreland basin provenanced in the south and thinning markedly to a stable platform area (the Murphy Inlier) in the north. The final major deformation comprised east-west compression producing north-south aligned faults that are particularly prominent at Mount Isa. Potential field studies of the northern Mount Isa Basin, principally using magnetic data (and to a lesser extent gravity data, satellite images and aerial photographs) exhibit remarkable correlation with the reflection seismic data. The potential field data contributed significantly to the unravelling of the northern Mount Isa Basin architecture and deformation. Structurally, the Mount Isa Basin consists of three distinct regions. From the north to the south they are the Bowthorn Block, the Riversleigh Fold Zone and the Cloncurry Orogen (new names). The Bowthorn Block, which is located between the Elizabeth Creek Thrust Zone and the Murphy Inlier, consists of an asymmetric wedge of volcanic, carbonate and clastic rocks. It ranges from over 10 000 m stratigraphic thickness in the south to less than 2000 m in the north. The Bowthorn Block is relatively undeformed; however, it contains a series of reverse faults trending east-west that are interpreted from seismic data to be down-to-the-north normal faults that have been reactivated as thrusts. The Riversleigh Fold Zone is a folded and faulted region south of the Bowthorn Block, comprising much of the area formerly referred to as the Lawn Hill Platform. The Cloncurry Orogen consists of the area and sequences equivalent to the former Mount Isa Orogen.
The name Cloncurry Orogen clearly distinguishes this area from the wider concept of the Mount Isa Basin. The South Nicholson Group and its probable correlatives, the Pilpah Sandstone and Quamby Conglomerate, comprise a later phase of now largely eroded deposits within the Mount Isa Basin. The name South Nicholson Basin is now outmoded as this terminology only applied to the South Nicholson Group, unlike the original broader definition in Brown et al. (1968). Cored slimhole stratigraphic and mineral wells drilled by Amoco, Esso, Elf Aquitaine and Carpentaria Exploration prior to 1986 penetrated much of the stratigraphy and intersected both minor oil and gas shows plus excellent potential source rocks. The raw data were reinterpreted and augmented with seismic stratigraphy and source rock data from resampled mineral and petroleum stratigraphic exploration wells for this study. Since 1986, Comalco Aluminium Limited, as operator of a joint venture with Monument Resources Australia Limited and Bridge Oil Limited, recorded approximately 1000 km of reflection seismic data within the basin and drilled one conventional stratigraphic petroleum well, Beamesbrook-1. This work was the first reflection seismic and first conventional petroleum test of the northern Mount Isa Basin. When incorporated into the newly developed foreland basin and maturity models, a grass roots petroleum exploration play was recognised and this led to the present thesis. The Mount Isa Basin was seen to contain excellent source rocks coupled with potential reservoirs and all of the other essential aspects of a conventional petroleum exploration play. This play, although high risk, was commensurate with the enormous and totally untested petroleum potential of the basin. The basin was assessed for hydrocarbons in 1992 with three conventional exploration wells, Desert Creek-1, Argyle Creek-1 and Egilabria-1. These wells also tested and confirmed the proposed basin model.
No commercially viable oil or gas was encountered although evidence of its former existence was found. In addition to the petroleum exploration, indeed as a consequence of it, the association of the extensive base metal and other mineralisation in the Mount Isa Basin with hydrocarbons could not be overlooked. A comprehensive analysis of the available data suggests a link between the migration and possible generation or destruction of hydrocarbons and metal bearing fluids. Consequently, base metal exploration based on hydrocarbon exploration concepts is probably the most effective technique in such basins. The metal-hydrocarbon-sedimentary basin-plate tectonic association (analogous to Phanerozoic models) is a compelling outcome of this work on the Palaeo- to Mesoproterozoic Mount Isa Basin. Petroleum within the Bowthorn Block was apparently destroyed by hot brines that produced many ore deposits elsewhere in the basin.

Many large coal mining operations in Australia rely heavily on the rail network to transport coal from mines to coal terminals at ports for shipment. Over the last few years, due to the fast growing demand, the coal rail network is becoming one of the worst industrial bottlenecks in Australia. As a result, this provides great incentives for pursuing better optimisation and control strategies for the operation of the whole rail transportation system under network and terminal capacity constraints. This PhD research aims to achieve a significant efficiency improvement in a coal rail network on the basis of the development of standard modelling approaches and generic solution techniques. Generally, the train scheduling problem can be modelled as a Blocking Parallel-Machine Job-Shop Scheduling (BPMJSS) problem. In a BPMJSS model for train scheduling, trains and sections respectively are synonymous with jobs and machines and an operation is regarded as the movement/traversal of a train across a section. To begin, an improved shifting bottleneck procedure algorithm combined with metaheuristics has been developed to efficiently solve the Parallel-Machine Job-Shop Scheduling (PMJSS) problems without the blocking conditions. Due to the lack of buffer space, the real-life train scheduling should consider blocking or hold-while-wait constraints, which means that a track section cannot release and must hold a train until the next section on the routing becomes available. As a consequence, the problem has been considered as BPMJSS with the blocking conditions. To develop efficient solution techniques for BPMJSS, extensive studies on the nonclassical scheduling problems regarding the various buffer conditions (i.e. blocking, no-wait, limited-buffer, unlimited-buffer and combined-buffer) have been done.
In this procedure, an alternative graph as an extension of the classical disjunctive graph is developed and specially designed for the non-classical scheduling problems such as the blocking flow-shop scheduling (BFSS), no-wait flow-shop scheduling (NWFSS), and blocking job-shop scheduling (BJSS) problems. By exploring the blocking characteristics based on the alternative graph, a new algorithm called the topological-sequence algorithm is developed for solving the non-classical scheduling problems. To indicate the preeminence of the proposed algorithm, we compare it with two known algorithms (i.e. Recursive Procedure and Directed Graph) in the literature. Moreover, we define a new type of non-classical scheduling problem, called combined-buffer flow-shop scheduling (CBFSS), which covers four extreme cases: the classical FSS (FSS) with infinite buffer, the blocking FSS (BFSS) with no buffer, the no-wait FSS (NWFSS) and the limited-buffer FSS (LBFSS). After exploring the structural properties of CBFSS, we propose an innovative constructive algorithm named the LK algorithm to construct the feasible CBFSS schedule. Detailed numerical illustrations for the various cases are presented and analysed. By adjusting only the attributes in the data input, the proposed LK algorithm is generic and enables the construction of the feasible schedules for many types of non-classical scheduling problems with different buffer constraints. Inspired by the shifting bottleneck procedure algorithm for PMJSS and characteristic analysis based on the alternative graph for non-classical scheduling problems, a new constructive algorithm called the Feasibility Satisfaction Procedure (FSP) is proposed to obtain the feasible BPMJSS solution. A real-world train scheduling case is used for illustrating and comparing the PMJSS and BPMJSS models. 
Some real-life applications including considering the train length, upgrading the track sections, accelerating a tardy train and changing the bottleneck sections are discussed. Furthermore, the BPMJSS model is generalised to be a No-Wait Blocking Parallel-Machine Job-Shop Scheduling (NWBPMJSS) problem for scheduling the trains with priorities, in which prioritised trains such as express passenger trains are considered simultaneously with non-prioritised trains such as freight trains. In this case, no-wait conditions, which are more restrictive constraints than blocking constraints, arise when considering the prioritised trains that should traverse continuously without any interruption or any unplanned pauses because of the high cost of waiting during travel. In comparison, non-prioritised trains are allowed to enter the next section immediately if possible or to remain in a section until the next section on the routing becomes available. Based on the FSP algorithm, a more generic algorithm called the SE algorithm is developed to solve a class of train scheduling problems in terms of different conditions in train scheduling environments. To construct the feasible train schedule, the proposed SE algorithm consists of many individual modules including the feasibility-satisfaction procedure, time-determination procedure, tune-up procedure and conflict-resolve procedure algorithms. To find a good train schedule, a two-stage hybrid heuristic algorithm called the SE-BIH algorithm is developed by combining the constructive heuristic (i.e. the SE algorithm) and the local-search heuristic (i.e. the Best-Insertion-Heuristic algorithm). To optimise the train schedule, a three-stage algorithm called the SE-BIH-TS algorithm is developed by combining the tabu search (TS) metaheuristic with the SE-BIH algorithm. Finally, a case study is performed for a complex real-world coal rail network under network and terminal capacity constraints.
The computational results validate that the proposed methodology would be very promising because it can be applied as a fundamental tool for modelling and solving many real-world scheduling problems.
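
A core step shared by such constructive procedures is computing earliest start times on a precedence graph once the job order and machine order of every operation are fixed. A minimal sketch of that step follows (the blocking and no-wait constraints that the FSP and SE algorithms handle are deliberately omitted):

```python
def earliest_starts(ops):
    """ops maps op-id -> (duration, job_predecessor, machine_predecessor).
    Returns earliest start times by relaxing operations whose predecessors
    are already scheduled (i.e. in topological order of the precedence DAG)."""
    start, done = {}, set()
    while len(done) < len(ops):
        for o, (dur, jp, mp) in ops.items():
            if o in done:
                continue
            preds = [p for p in (jp, mp) if p is not None]
            if all(p in done for p in preds):
                start[o] = max((start[p] + ops[p][0] for p in preds), default=0.0)
                done.add(o)
    return start

# Toy instance: two trains (jobs) A->B and C->D over two track sections
# (machines), with section 1 sequenced A then C and section 2 B then D.
ops = {"A": (3, None, None), "B": (2, "A", None),
       "C": (2, None, "A"), "D": (4, "C", "B")}
start = earliest_starts(ops)
makespan = max(s + ops[o][0] for o, s in start.items())
print(start, makespan)  # makespan 9
```

Under blocking, a train would additionally hold its section until the next one clears, which is why the alternative-graph machinery described above is needed in place of this simple relaxation.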

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: 1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; 2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique; 3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral; and 4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
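
The two-component decomposition underlying DEXA can be illustrated by solving the two log-attenuation equations for bone and soft-tissue areal densities. In this sketch the mass attenuation coefficients are illustrative round numbers, not tabulated values, and scatter is ignored (narrow-beam geometry):

```python
import math

# Illustrative mass attenuation coefficients (cm^2/g); not tabulated data.
MU = {"low":  {"bone": 0.60, "soft": 0.25},
      "high": {"bone": 0.30, "soft": 0.20}}

def transmit(t_bone, t_soft, energy, I0=1.0):
    """Narrow-beam transmission through bone plus soft tissue
    (t_* are areal densities in g/cm^2)."""
    mu = MU[energy]
    return I0 * math.exp(-mu["bone"] * t_bone - mu["soft"] * t_soft)

def dexa_decompose(I_low, I_high, I0=1.0):
    """Invert the two log-attenuation equations for (t_bone, t_soft)."""
    y1 = -math.log(I_low / I0)   # = mu_b(lo)*t_b + mu_s(lo)*t_s
    y2 = -math.log(I_high / I0)  # = mu_b(hi)*t_b + mu_s(hi)*t_s
    a, b = MU["low"]["bone"], MU["low"]["soft"]
    c, d = MU["high"]["bone"], MU["high"]["soft"]
    det = a * d - b * c
    return (d * y1 - b * y2) / det, (a * y2 - c * y1) / det

t_bone, t_soft = dexa_decompose(transmit(1.2, 15.0, "low"),
                                transmit(1.2, 15.0, "high"))
print(round(t_bone, 6), round(t_soft, 6))  # recovers 1.2 and 15.0
```

The DPA(+) extension described above adds a path-length measurement as a third equation, allowing fat and lean soft tissue to be separated at the cost of the precision penalty noted in the findings.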

Patients undergoing radiation therapy for cancer face a series of challenges that require support from a multidisciplinary team which includes radiation oncology nurses. However, the specific contribution of nursing, and the models of care that best support the delivery of nursing interventions in the radiotherapy setting, is not well described. In this case study, the Interaction Model of Client Health Behaviour and the associated principles of person-centred care were incorporated into a new model of care that was implemented in one radiation oncology setting in Brisbane, Australia. The new model of care was operationalised through a Primary Nursing/Collaborative Practice framework. To evaluate the impact of the new model for patients and health professionals, multiple sources of data were collected from patients and clinical staff prior to, during, and 18 months following introduction of the practice redesign. One cohort of patients and clinical staff completed surveys incorporating measures of key outcomes immediately prior to implementation of the model, while a second cohort of patients and clinical staff completed these same surveys 18 months following introduction of the model. In-depth interviews were also conducted with nursing, medical and allied health staff throughout the implementation phase to obtain a more comprehensive account of the processes and outcomes associated with implementing such a model. From the patients’ perspectives, this study demonstrated that, although adverse effects of radiotherapy continue to affect patient well-being, patients continue to be satisfied with nursing care in this specialty, and that they generally reported high levels of functioning despite undergoing a curative course of radiotherapy. 
From the health professionals’ perspective, there was evidence of attitudinal change by nursing staff within the radiotherapy department which reflected a greater understanding and appreciation of a more person-centred approach to care. Importantly, this case study has also confirmed that a range of factors need to be considered when redesigning nursing practice in the radiotherapy setting, as challenges were experienced in changing traditional practices, ensuring multidisciplinary approaches to care, and resourcing the new model. The findings from this study suggest that the move from a relatively functional approach to a person-centred approach in the radiotherapy setting has contributed to some improvements in the provision of individualised and coordinated patient care. However, this study has also highlighted that primary nursing may be limited as a framework for patient care unless it is supported by a whole-team approach, an appropriate supportive governance model, and sufficient resourcing. Introducing such a model thus requires effective education, preparation and ongoing support for the whole team. The challenges of providing care in the context of complex interdisciplinary relationships have been highlighted by this study. Aspects of this study may assist in planning further nursing interventions for patients undergoing radiotherapy for cancer, and continue to enhance the contribution of the radiation oncology nurse to improved patient outcomes.


Background Leisure-time physical activity (LTPA) shows promise for reducing the risk of poor mental health in later life, although gender- and age-specific research is required to clarify this association. This study examined the concurrent and prospective relationships between both LTPA and walking with mental health in older women. Methods Community-dwelling women aged 73–78 years completed mailed surveys in 1999, 2002 and 2005 for the Australian Longitudinal Study on Women's Health. Respondents reported their weekly minutes of walking, moderate LTPA and vigorous LTPA. Mental health was defined as the number of depression and anxiety symptoms, as assessed with the Goldberg Anxiety and Depression Scale (GADS). Multivariable linear mixed models, adjusted for socio-demographic and health-related variables, were used to examine associations between five levels of LTPA (none, very low, low, intermediate and high) and GADS scores. For women who reported walking as their only LTPA, associations between walking and GADS scores were also examined. Women who reported depression or anxiety in 1999 were excluded, resulting in data from 6653 women being included in these analyses. Results Inverse dose–response associations were observed between both LTPA and walking with GADS scores in concurrent and prospective models (p<0.001). Even low levels of LTPA and walking were associated with lowered scores. The lowest scores were observed in women reporting high levels of LTPA or walking. Conclusion The results support an inverse dose–response association between both LTPA and walking with mental health, over 3 years in older women without depression or anxiety.
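As a rough illustration of the inverse dose–response modelling described above, the sketch below fits a least-squares slope to synthetic activity-level and symptom-score data. The levels, effect size, and noise are invented for illustration and are not the ALSWH or GADS data; the study itself used multivariable linear mixed models rather than this simple trend fit.

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical activity levels (0 = none ... 4 = high) and synthetic
# symptom scores built with an assumed inverse dose-response.
# All numbers here are illustrative placeholders, not study data.
levels = [random.randint(0, 4) for _ in range(2000)]
scores = [6.0 - 0.8 * lvl + random.gauss(0, 1.5) for lvl in levels]

# Least-squares slope of score on activity level: cov(x, y) / var(x).
mx, my = mean(levels), mean(scores)
slope = (sum((x - mx) * (y - my) for x, y in zip(levels, scores))
         / sum((x - mx) ** 2 for x in levels))

# A negative slope reproduces the inverse dose-response pattern.
print(f"estimated change in symptom score per activity level: {slope:.2f}")
```

With 2000 simulated observations the fitted slope sits close to the assumed −0.8 per level, mirroring the pattern of lower scores at higher activity levels reported in the abstract.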


Sustainability has been increasingly recognised as an integral part of highway infrastructure development. In practice, however, because financial return is still a project’s top priority for many, environmental aspects tend to be overlooked or considered a burden, as they add to project costs. Sustainability and its implications have a far-reaching effect on each project over time. Therefore, given highway infrastructure’s long-term life span and huge capital demand, the consideration of environmental cost/benefit issues is all the more crucial in life-cycle cost analysis (LCCA). To date, there is little in the existing literature on viable estimation methods for environmental costs. This situation presents the potential for focused studies on environmental costs and issues in the context of life-cycle cost analysis. This paper discusses a research project which aims to integrate environmental cost elements and issues into a conceptual framework for life-cycle costing analysis for highway projects. Cost elements and issues concerning the environment were first identified through the literature. Through questionnaires, these environmental cost elements will be validated by practitioners before their consolidation into an extension of existing, working models of life-cycle costing analysis (LCCA). A holistic decision support framework is being developed to assist highway infrastructure stakeholders to evaluate their investment decisions in a way that generates financial returns while maximising environmental benefits and sustainability outcomes.
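The core LCCA calculation that the framework extends can be sketched as discounting all cash flows to present value, with environmental costs added alongside agency costs. The figures, the 4% discount rate, and the cost categories below are hypothetical placeholders, not values from the study.

```python
# Hypothetical life-cycle cost sketch for a highway segment: agency costs
# (construction, maintenance) plus an environmental cost element, all
# discounted to present value. Every number below is illustrative only.

def present_value(cost: float, year: int, rate: float = 0.04) -> float:
    """Discount a cost incurred in `year` back to year 0."""
    return cost / (1 + rate) ** year

# (year, agency_cost, environmental_cost) in arbitrary currency units
cash_flows = [
    (0, 10_000_000, 500_000),   # construction + construction impacts
    (10, 1_200_000, 80_000),    # resurfacing
    (20, 1_200_000, 80_000),    # resurfacing
    (30, 3_000_000, 150_000),   # major rehabilitation
]

# Conventional LCCA ignores the environmental column; the extended
# analysis folds it into the same discounted sum.
lcc_agency = sum(present_value(c, y) for y, c, _ in cash_flows)
lcc_total = sum(present_value(c + e, y) for y, c, e in cash_flows)

print(f"agency-only LCC: {lcc_agency:,.0f}")
print(f"LCC incl. environmental costs: {lcc_total:,.0f}")
```

The gap between the two totals is one simple way to express the environmental cost element that conventional, finance-only LCCA leaves out.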


Crash prediction models are used for a variety of purposes, including forecasting the expected future performance of various transportation system segments with similar traits. The influence of intersection features on safety has been examined extensively because intersections experience a relatively large proportion of motor vehicle conflicts and crashes compared to other segments in the transportation system. Reported effects of left-turn lanes at intersections, in particular, have been mixed: some researchers have found that left-turn lanes are beneficial to safety, while others have reported detrimental effects. This inconsistency is not surprising given that the installation of left-turn lanes is often endogenous, that is, influenced by crash counts and/or traffic volumes. Endogeneity creates problems in econometric and statistical models and is likely to account for the inconsistencies reported in the literature. This paper reports on a limited-information maximum likelihood (LIML) estimation approach to compensate for endogeneity between left-turn lane presence and angle crashes. The effects of endogeneity are mitigated using the approach, revealing the unbiased effect of left-turn lanes on crash frequency for a dataset of Georgia intersections. The research shows that without accounting for endogeneity, left-turn lanes ‘appear’ to contribute to crashes; however, when endogeneity is accounted for in the model, left-turn lanes reduce angle crash frequencies, as expected by engineering judgment. Other endogenous variables may lurk in crash models as well, suggesting that the method may be used to correct simultaneity problems with other variables and in other transportation modeling contexts.
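A minimal sketch of the endogeneity problem described here, using synthetic data: lanes are "installed" preferentially at high-risk sites, so a naive comparison shows a harmful association even though the simulated true effect is beneficial. The stratified adjustment below stands in for the identifying role that instruments play in the paper's LIML approach; it is not the LIML estimator itself, and all numbers are invented.

```python
import random
from statistics import mean

random.seed(1)

# Synthetic intersections: left-turn lanes tend to be installed where
# unobserved crash risk is already high (endogenous treatment), while
# the simulated true lane effect is a REDUCTION of 2 crashes.
TRUE_LANE_EFFECT = -2.0
sites = []
for _ in range(20_000):
    risk = random.gauss(0, 1)                              # unobserved site risk
    lane = 1 if risk + random.gauss(0, 0.5) > 0.5 else 0   # endogenous installation
    crashes = 5 + 3 * risk + TRUE_LANE_EFFECT * lane + random.gauss(0, 0.5)
    sites.append((lane, risk, crashes))

# Naive estimate: difference in mean crashes by lane presence.
naive = (mean(c for l, r, c in sites if l == 1)
         - mean(c for l, r, c in sites if l == 0))

# "Oracle" adjustment: compare lane vs no-lane sites within narrow
# strata of the risk variable that is unobserved in practice.
diffs = []
for lo in [x / 4 for x in range(-8, 8)]:                   # risk bins of width 0.25
    stratum = [(l, c) for l, r, c in sites if lo <= r < lo + 0.25]
    with_lane = [c for l, c in stratum if l == 1]
    without_lane = [c for l, c in stratum if l == 0]
    if len(with_lane) >= 5 and len(without_lane) >= 5:
        diffs.append(mean(with_lane) - mean(without_lane))
adjusted = mean(diffs)

print(f"naive effect: {naive:+.2f}")        # positive: lanes 'appear' harmful
print(f"adjusted effect: {adjusted:+.2f}")  # negative: lanes reduce crashes
```

The sign flip between the naive and adjusted estimates reproduces, in miniature, the paper's finding that left-turn lanes only "appear" to contribute to crashes until the endogeneity of their installation is accounted for.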


In a seminal data mining article, Leo Breiman [1] argued that to develop effective predictive classification and regression models, we need to move away from sole dependency on statistical algorithms and embrace a wider toolkit of modeling algorithms that includes data mining procedures. Nevertheless, many researchers still rely solely on statistical procedures when undertaking data modeling tasks; this sole reliance has led to the development of irrelevant theory and questionable research conclusions ([1], p. 199). We outline initiatives that the HPC & Research Support group is undertaking to engage researchers with data mining tools and techniques, including a new range of seminars, workshops, and one-on-one consultations covering data mining algorithms, the relationship between data mining and the research cycle, and the limitations and problems of these new algorithms. Organisational limitations and restrictions on these initiatives are also discussed.


Around the world, particularly in North America and Australia, urban sprawl combined with low-density suburban development has caused serious accessibility and mobility problems, especially for those who do not own a motor vehicle or have access to public transportation services. Sustainable urban and transportation development is seen as crucial in solving transportation disadvantage problems in urban settlements. However, current urban and transportation models have not adequately addressed the unsustainable urban transportation problems that transportation disadvantaged groups overwhelmingly encounter, and the negative impacts on the disadvantaged have not been effectively considered. Transportation disadvantage is a multi-dimensional problem that combines demographic, spatial and transportation service dimensions. Nevertheless, most transportation models focusing on transportation disadvantage employ only the demographic and transportation service dimensions and do not take the spatial dimension into account. This paper aims to investigate the link between sustainable urban and transportation development and the spatial dimension of the transportation disadvantage problem. For that purpose, the paper provides a thorough review of the literature and identifies a set of urban, development and policy characteristics to define the spatial dimension of the transportation disadvantage problem. This paper presents an overview of these urban, development and policy characteristics that have significant relationships with sustainable urban and transportation development and travel inability, which are also useful in determining transportation disadvantaged populations.