900 results for work system method


Relevance: 40.00%

Abstract:

Objective: Partnerships in mental health care, particularly between public and private psychiatric services, are increasingly recognized as important for optimizing patient management and the efficient organization of services. However, public sector mental health services and private psychiatrists do not always work well together, and there appear to be a number of barriers to effective collaboration. This study set out to investigate the extent of collaborative 'shared care' arrangements between a public mental health service and private psychiatrists practising nearby. It also examined possible barriers to collaboration and some possible solutions to the identified problems. Method: A questionnaire examining the above factors was sent to all public sector mental health clinicians and all private psychiatrists in the area. Results: One hundred and five of the 154 (68.2%) public sector clinicians and 103 of the 194 (53.1%) private psychiatrists returned surveys. The main barriers to successful collaboration identified by members of both sectors were 'Difficulty communicating', endorsed by 71.4% of public clinicians and 72% of private psychiatrists; 'Confusion of roles and responsibilities', endorsed by 62.9% and 66%, respectively; and 'Different treatment approach', endorsed by 47.6% and 45.6%, respectively. Over 60% of private psychiatrists identified problems with access to the public system as a barrier to successful shared care arrangements. It also emerged, as hypothesized, that the public and private systems tend to manage different patient populations and that public clinicians in particular are not fully aware of the private psychiatrists' range of expertise, which is likely to result in fewer referrals for shared care across the sectors. Conclusions: A number of barriers to public sector clinicians and private psychiatrists collaborating in shared care arrangements were identified, and the two groups surveyed identified similar barriers. Some of these can potentially be addressed by changes to service systems; others require cultural shifts in both sectors. Improved communication, including more opportunities for formal and informal meetings between people working in the two sectors, would be likely to improve each sector's understanding of the other's perspective and practice. Further changes would require careful work between the sectors on training, employment and practice protocols and initiatives, to allow better use of existing services and resources.

Relevance: 40.00%

Abstract:

The country-product-dummy (CPD) method, originally proposed in Summers (1973), has recently been revisited in its weighted formulation to handle a variety of data-related situations (Rao and Timmer, 2000, 2003; Heravi et al., 2001; Rao, 2001; Aten and Menezes, 2002; Heston and Aten, 2002; Deaton et al., 2004). The CPD method is also increasingly being used in the context of hedonic modelling, rather than for its original purpose in Summers (1973) of filling holes in incomplete price data. However, the CPD method is seen among practitioners as a black box because of its regression formulation. The main objective of this paper is to establish the equivalence of the purchasing power parities and international prices derived from the weighted-CPD method with those arising from the Rao system for multilateral comparisons. A major implication of this result is that the weighted-CPD method would then be a natural method of aggregation at all levels of aggregation within the context of international comparisons.
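The CPD regression underlying this discussion is compact. The following minimal sketch (not the authors' code) illustrates a weighted CPD estimation in Python: log prices are regressed on country and product dummies with expenditure-share weights, and the country coefficients exponentiate to purchasing power parities while the product coefficients exponentiate to international prices, up to the normalization that the base country's PPP equals 1. The input arrays `prices`, `countries`, `products`, and `weights` are hypothetical.

```python
import numpy as np

def weighted_cpd(prices, countries, products, weights):
    """Weighted country-product-dummy regression (illustrative sketch).

    prices    : 1-D array of observed positive prices
    countries : integer country index for each observation
    products  : integer product index for each observation
    weights   : expenditure-share weight for each observation
    Returns (ppp, international_price), with the first country's PPP
    normalized to 1.
    """
    n_c = int(countries.max()) + 1
    n_p = int(products.max()) + 1
    n_obs = len(prices)

    # Design matrix of country and product dummies; the first country dummy
    # is dropped so that the system is identified.
    X = np.zeros((n_obs, (n_c - 1) + n_p))
    for i, (c, j) in enumerate(zip(countries, products)):
        if c > 0:
            X[i, c - 1] = 1.0
        X[i, (n_c - 1) + j] = 1.0

    # Weighted least squares on log prices.
    w = np.sqrt(weights)
    beta, *_ = np.linalg.lstsq(X * w[:, None], np.log(prices) * w, rcond=None)

    ppp = np.exp(np.concatenate(([0.0], beta[:n_c - 1])))  # PPPs, base country = 1
    international_price = np.exp(beta[n_c - 1:])            # international prices
    return ppp, international_price
```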

Relevance: 40.00%

Abstract:

The Home Medicines Review (HMR) model contains a mechanism whereby anyone who is concerned about the risk of medication misadventure can request an HMR from the patient's GP. Since nurses are widely involved in a range of triage and gatekeeping roles, utilising their primary care skills to identify patients for an HMR is a logical extension of this role. Furthermore, community nurses visit their clients at home and see at first hand many of the difficulties a client may be experiencing; they are therefore well placed to request specialist assistance for the client. Blue Care in Brisbane, a community nursing service, approached its local Division of General Practice to determine how best to request HMRs for its clients. The Division contacted The University of Queensland, which initiated this study to engage the health care team in tailoring the established HMR request process to the needs of community nurses and to test the system developed. (non-author abstract)

Relevance: 40.00%

Abstract:

Most magnetic resonance imaging (MRI) spatial encoding techniques employ low-frequency pulsed magnetic field gradients that undesirably induce multiexponentially decaying eddy currents in nearby conducting structures of the MRI system. The eddy currents degrade the switching performance of the gradient system, distort the MRI image, and introduce thermal loads in the cryostat vessel and superconducting MRI components. Heating of superconducting magnets due to induced eddy currents is particularly problematic as it offsets the superconducting operating point, which can cause a system quench. A numerical characterization of transient eddy current effects is vital for their compensation/control and for further advancement of MRI technology as a whole. However, transient eddy current calculations are particularly computationally intensive. In large-scale problems, such as gradient switching in MRI, conventional finite-element method (FEM)-based routines impose very large computational loads during generation/solving of the system equations. Therefore, other computational alternatives need to be explored. This paper outlines a three-dimensional finite-difference time-domain (FDTD) method in cylindrical coordinates for the modeling of low-frequency transient eddy currents in MRI, as an extension to the recently proposed time-harmonic scheme. The weakly coupled Maxwell's equations are adapted to the low-frequency regime by downscaling the speed of light constant, which permits the use of larger FDTD time steps while maintaining the validity of the Courant-Friedrichs-Lewy stability condition. The principal hypothesis of this work is that the modified FDTD routine can be employed to analyze pulsed-gradient-induced, transient eddy currents in superconducting MRI system models. The hypothesis is supported through a verification of the numerical scheme on a canonical problem and by analyzing undesired temporal eddy current effects such as the B0 shift caused by actively shielded symmetric/asymmetric transverse x-gradient head coils and unshielded z-gradient whole-body coils operating in proximity to a superconducting MRI magnet.
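The speed-of-light downscaling can be illustrated with a back-of-the-envelope Courant-Friedrichs-Lewy estimate. The sketch below is illustrative only (the cell sizes and the scaling factor are assumptions, not values from the paper): reducing the wave speed by, say, a factor of 10^4 relaxes the CFL limit on the FDTD time step by the same factor, while the quasi-static eddy-current physics of interest is essentially unaffected.

```python
import math

C0 = 299_792_458.0  # physical speed of light, m/s

def cfl_time_step(dr, dphi_arc, dz, wave_speed):
    """Largest stable FDTD time step allowed by the CFL condition for a
    3-D grid with cell edge lengths dr, r*dphi (arc length) and dz, in metres."""
    return 1.0 / (wave_speed * math.sqrt(1.0 / dr**2 + 1.0 / dphi_arc**2 + 1.0 / dz**2))

# Hypothetical cylindrical-grid cell of roughly 5 mm edge length.
dr = dphi_arc = dz = 5e-3

dt_full = cfl_time_step(dr, dphi_arc, dz, C0)          # on the order of 1e-11 s
dt_scaled = cfl_time_step(dr, dphi_arc, dz, C0 / 1e4)  # on the order of 1e-7 s

print(f"dt with physical c : {dt_full:.3e} s")
print(f"dt with c/1e4      : {dt_scaled:.3e} s")
```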

Relevance: 40.00%

Abstract:

Traditional vaccines consisting of whole attenuated microorganisms, killed microorganisms, or microbial components, administered with an adjuvant (e.g. alum), have proved extremely successful. However, to develop new vaccines, or to improve upon current vaccines, new vaccine development techniques are required. Peptide vaccines offer the capacity to administer only the minimal microbial components necessary to elicit appropriate immune responses, minimizing the risk of vaccination-associated adverse effects and focusing the immune response toward important antigens. Peptide vaccines, however, are generally poorly immunogenic, necessitating administration with powerful, and potentially toxic, adjuvants. The attachment of lipids to peptide antigens has been demonstrated as a potentially safe method for adjuvanting peptide epitopes. The lipid core peptide (LCP) system, which incorporates a lipidic adjuvant, carrier, and peptide epitopes into a single molecular entity, has been demonstrated to boost the immunogenicity of attached peptide epitopes without the need for additional adjuvants. The synthesis of LCP systems normally yields a product that cannot be purified to homogeneity. The current study describes the development of methods for the synthesis of highly pure LCP analogs using native chemical ligation. Because of the highly lipophilic nature of the LCP lipid adjuvant, difficulties (e.g. poor solubility) were experienced with the ligation reactions. The addition of organic solvents to the ligation buffer solubilized the lipidic species but did not result in successful ligation reactions. In contrast, the addition of approximately 1% (w/v) sodium dodecyl sulfate (SDS) proved successful, enabling the synthesis of two highly pure, tri-epitopic Streptococcus pyogenes LCP analogs. Subcutaneous immunization of B10.BR (H-2(k)) mice with one of these vaccines, without the addition of any adjuvant, elicited high levels of systemic IgG antibodies against each of the incorporated peptides. Copyright (c) 2006 European Peptide Society and John Wiley & Sons, Ltd.

Relevance: 40.00%

Abstract:

To foster ongoing international cooperation beyond ACES (APEC Cooperation for Earthquake Simulation) on the simulation of solid earth phenomena, agreement was reached to work towards the establishment of a frontier international research institute for simulating the solid earth: iSERVO = International Solid Earth Research Virtual Observatory institute (http://www.iservo.edu.au). This paper outlines a key Australian contribution towards the iSERVO institute seed project, namely the construction of: (1) a typical intraplate fault system model using practical fault system data from South Australia (i.e., the SA interacting fault model), which includes data management and editing, geometrical modelling and mesh generation; and (2) a finite-element-based software tool, built on our long-term and ongoing effort to develop the R-minimum-strategy-based finite-element computational algorithm and software tool for modelling three-dimensional nonlinear frictional contact behavior between multiple deformable bodies with the arbitrarily-shaped contact element strategy. A numerical simulation of the SA fault system is carried out using this software tool to demonstrate its capability and our efforts towards seeding the iSERVO Institute.

Relevance: 40.00%

Abstract:

A deregulated electricity market is characterized by uncertainties over both the long and the short term. As one of the major long-term planning issues, transmission expansion planning (TEP) aims to provide reliable and secure network support for market participants. TEP covers two major issues: technical assessment and financial evaluation. Traditionally, the net present value (NPV) method has been the most widely accepted approach to financial evaluation; it is simple to apply and easy to understand. Nevertheless, TEP in a deregulated market needs a more dynamic approach that incorporates a project's management flexibility, i.e. the managerial ability to adapt in response to unpredictable market developments. The real options approach (ROA) is introduced here; it has the clear advantage of accounting for the future courses of action that investors may take, and it delivers results in understandable monetary terms. In the case study, a Nordic test system is examined and several scenarios for network expansion planning are given. Both the technical assessment and the financial evaluation are conducted in the case study.
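For readers unfamiliar with the two valuation approaches, the sketch below contrasts them in a deliberately simplified form (the cash flows, discount rate, and lattice parameters are hypothetical, not taken from the Nordic case study): the NPV is the discounted sum of expected cash flows, whereas a real-options treatment, here a one-period binomial deferral option, also prices the flexibility to postpone the investment until uncertainty is resolved.

```python
def npv(cash_flows, rate):
    """Net present value of cash flows c_0, c_1, ..., discounted at `rate`."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def deferral_option_value(v_up, v_down, p_up, investment, risk_free):
    """One-period binomial value of the option to defer an investment.

    v_up / v_down : project value next year in the up / down scenario
    p_up          : risk-neutral probability of the up scenario
    investment    : cost of building the line (paid only if exercised)
    risk_free     : one-period risk-free rate
    """
    payoff_up = max(v_up - investment, 0.0)    # invest only if it pays off
    payoff_down = max(v_down - investment, 0.0)
    return (p_up * payoff_up + (1.0 - p_up) * payoff_down) / (1.0 + risk_free)

# Hypothetical transmission project: 100 upfront, 18 per year for 10 years, 8% rate.
project_npv = npv([-100.0] + [18.0] * 10, 0.08)

# Same project with the right to wait a year while demand uncertainty resolves.
option_value = deferral_option_value(v_up=150.0, v_down=80.0, p_up=0.5,
                                     investment=100.0, risk_free=0.05)

print(f"static NPV      : {project_npv:.1f}")
print(f"deferral option : {option_value:.1f}")
```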

Relevance: 40.00%

Abstract:

Length and time scales vary from planetary scale and millions of years for convection problems to 100 km and 10 years for fault system simulations. Various techniques are in use to deal with time dependency (e.g. Crank-Nicolson), with non-linearity (e.g. Newton-Raphson) and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM) and the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming environment in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver; however, the design is open and other discretization technologies, such as finite differences and boundary element methods, could be included. escript is implemented as an extension of the interactive programming environment python (see www.python.org). Key concepts introduced are Data objects, which hold values on the nodes or elements of the finite element mesh, and linearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we present the basic concepts of escript and show how it is used to implement a simulation code for interacting fault systems. We show some results of large-scale, parallel simulations on an SGI Altix system. Acknowledgements: Project work is supported by the Australian Commonwealth Government through the Australian Computational Earth Systems Simulator Major National Research Facility, the Queensland State Government Smart State Research Facility Fund, The University of Queensland and SGI.
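To make the two key abstractions concrete, the following minimal sketch (not from the paper) solves a small Poisson-type problem in an escript-style workflow. It assumes the esys-escript Python API with a finley rectangle domain, a LinearPDE object configured through setValue, and Data-valued coefficients; class and argument names should be checked against the escript release actually used.

```python
from esys.escript import *                      # Data objects, kronecker, whereZero, Lsup
from esys.escript.linearPDEs import LinearPDE   # generic linear PDE interface
from esys.finley import Rectangle               # finley mesh generator / PDE solver

# 40 x 40 element finite-element mesh on the unit square.
mydomain = Rectangle(l0=1.0, l1=1.0, n0=40, n1=40)

# Clamp the solution to zero on the left and bottom edges (Dirichlet mask q, value r).
x = mydomain.getX()
gammaD = whereZero(x[0]) + whereZero(x[1])

# -div(A grad u) = Y with A = identity and Y = 1: a Poisson-type problem.
mypde = LinearPDE(mydomain)
mypde.setValue(A=kronecker(mydomain), Y=1.0, q=gammaD, r=0.0)

# The solution is returned as a Data object holding values on the mesh.
u = mypde.getSolution()
print("sup-norm of solution:", Lsup(u))
```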

Relevance: 40.00%

Abstract:

Among the Solar System's bodies, the Moon, Mercury and Mars are at present, or have been in recent years, the targets of space missions aimed, among other objectives, at improving our knowledge of surface composition. Among the techniques for detecting a planet's mineralogical composition, from both remote and close-range platforms, visible and near-infrared reflectance (VNIR) spectroscopy is a powerful tool, because crystal field absorption bands are related to particular transition metals in well-defined crystal structures, e.g., Fe2+ in the M1 and M2 sites of olivine or pyroxene (Burns, 1993). Thanks to improvements in the spectrometers on board recent missions, a more detailed interpretation of planetary surfaces can now be delineated. However, quantitative interpretation of planetary surface mineralogy is not always a simple task. In fact, several factors, such as mineral chemistry, the presence of different minerals that absorb in a narrow spectral range, regolith with a variable particle size range, space weathering, atmospheric composition, etc., act on the reflectance spectra of a planetary surface in ways that are difficult to predict (Serventi et al., 2014). One method for the interpretation of reflectance spectra of unknown materials involves the study of a number of spectra acquired in the laboratory under different conditions, such as different mineral abundances or different particle sizes, in order to derive empirical trends. This is the methodology followed in this PhD thesis: the individual factors listed above have been analyzed by creating, in the laboratory, a set of terrestrial analogues with well-defined composition and particle size. The aim of this work is to provide new tools and criteria to improve knowledge of the composition of planetary surfaces. In particular, mixtures with different contents and chemistries of plagioclase and mafic minerals have been analyzed spectroscopically at different particle sizes and with different relative mineral percentages. The reflectance spectra of each mixture have been analyzed both qualitatively (using the software ORIGIN®) and quantitatively, applying the Modified Gaussian Model (MGM; Sunshine et al., 1990) algorithm. In particular, the variations of the spectral parameters of each absorption band have been evaluated versus the volumetric FeO% content in the plagioclase (PL) phase and versus the PL modal abundance. This delineates calibration curves of composition versus spectral parameters and allows the implementation of spectral libraries. Furthermore, the trends derived from the terrestrial analogues analyzed here, and from analogues in the literature, have been applied to the interpretation of hyperspectral images of both plagioclase-rich (the Moon) and plagioclase-poor (Mars) bodies.
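As a rough illustration of the quantitative step, the Modified Gaussian Model represents the natural logarithm of reflectance as a continuum plus Gaussian absorptions expressed in energy (inverse wavelength), following Sunshine et al. (1990). The sketch below is illustrative only, with made-up continuum and band parameters rather than values from the thesis; it simply evaluates the functional form that is fitted to each laboratory spectrum to retrieve band centers, widths, and strengths.

```python
import numpy as np

def mgm_spectrum(wavelength_um, continuum, bands):
    """Modified-Gaussian-Model reflectance spectrum (illustrative sketch).

    wavelength_um : wavelengths in micrometres
    continuum     : (c0, c1) straight-line continuum in ln(reflectance)
    bands         : list of (strength, center_um, width) triples; strengths are
                    negative because the bands are absorptions, and the Gaussians
                    are expressed in inverse wavelength (energy)
    Returns the modelled reflectance.
    """
    c0, c1 = continuum
    ln_r = c0 + c1 * wavelength_um
    for strength, center_um, width in bands:
        ln_r += strength * np.exp(-((1.0 / wavelength_um - 1.0 / center_um) ** 2)
                                  / (2.0 * width ** 2))
    return np.exp(ln_r)

# Hypothetical pyroxene-like spectrum with 1 and 2 micron crystal-field bands.
wl = np.linspace(0.4, 2.5, 500)
model = mgm_spectrum(wl, continuum=(-0.1, -0.05),
                     bands=[(-0.6, 0.95, 0.25), (-0.3, 2.00, 0.12)])
```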

Relevance: 40.00%

Abstract:

Recent functional magnetic resonance imaging (fMRI) investigations of the interaction between cognition and reward processing have found that the lateral prefrontal cortex (PFC) areas are preferentially activated to both increasing cognitive demand and reward level. Conversely, ventromedial PFC (VMPFC) areas show decreased activation to the same conditions, indicating a possible reciprocal relationship between cognitive and emotional processing regions. We report an fMRI study of a rewarded working memory task, in which we further explore how the relationship between reward and cognitive processing is mediated. We not only assess the integrity of reciprocal neural connections between the lateral PFC and VMPFC brain regions in different experimental contexts but also test whether additional cortical and subcortical regions influence this relationship. Psychophysiological interaction analyses were used as a measure of functional connectivity in order to characterize the influence of both cognitive and motivational variables on connectivity between the lateral PFC and the VMPFC. Psychophysiological interactions revealed negative functional connectivity between the lateral PFC and the VMPFC in the context of high memory load, and high memory load in tandem with a highly motivating context, but not in the context of reward alone. Physiophysiological interactions further indicated that the dorsal anterior cingulate and the caudate nucleus modulate this pathway. These findings provide evidence for a dynamic interplay between lateral PFC and VMPFC regions and are consistent with an emotional gating role for the VMPFC during cognitively demanding tasks. Our findings also support neuropsychological theories of mood disorders, which have long emphasized a dysfunctional relationship between emotion/motivational and cognitive processes in depression.
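For readers less familiar with the technique, a psychophysiological interaction analysis amounts to adding an interaction regressor, the element-wise product of the seed-region time course and the psychological task regressor, to an ordinary GLM alongside the two main effects; a negative interaction weight then indicates that seed-target coupling decreases in that task context. The sketch below is a schematic illustration under simplifying assumptions (no hemodynamic deconvolution, hypothetical variable names), not the authors' analysis pipeline.

```python
import numpy as np

def ppi_interaction_beta(seed_ts, task_reg, target_ts):
    """Estimate the PPI effect of task context on seed-to-target coupling.

    seed_ts   : time course of the seed region (e.g. lateral PFC), 1-D array
    task_reg  : psychological regressor (e.g. +1 high load, -1 low load)
    target_ts : time course of the target region (e.g. VMPFC)
    Returns the regression weight on the seed x task interaction term.
    """
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    task = task_reg - task_reg.mean()
    interaction = seed * task                      # the PPI regressor

    # Design matrix: main effects, interaction, and an intercept column.
    X = np.column_stack([seed, task, interaction, np.ones_like(seed)])
    betas, *_ = np.linalg.lstsq(X, target_ts, rcond=None)
    return betas[2]   # weight on the interaction term
```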

Relevance: 40.00%

Abstract:

We present, for the first time to our knowledge, experimental evidence showing that superimposed blazed fiber Bragg gratings may be fabricated and used to extend the dynamic range of a grating-based spectrometer. Blazed gratings of 4° and 8° were superimposed in germanosilicate fiber by ultraviolet inscription and used in conjunction with a coated charge-coupled device array to interrogate a wavelength-division-multiplexing sensor array. We show that the system can be used to monitor strain and temperature sensors simultaneously, with a usable bandwidth extendable to 70 nm.