829 results for Two Approaches


Relevance:

30.00%

Publisher:

Abstract:

Objective To understand differences in the managerial ethical decision-making styles of Australian healthcare managers through the exploratory use of the Managerial Ethical Profiles (MEP) Scale. Background Healthcare managers (doctors, nurses, allied health practitioners and non-clinically trained professionals) are faced with a raft of variables when making decisions within the workplace. In the absence of clear protocols and policies, healthcare managers rely on a range of personal experiences, personal ethical philosophies, personal factors and organizational factors to arrive at a decision. Understanding the dominant approaches to managerial ethical decision-making, particularly for clinically trained healthcare managers, is a fundamental step both in raising awareness of how managers make decisions and in providing a basis for the ongoing development of healthcare managers. Design Cross-sectional. Methods The study adopts a taxonomic approach that simultaneously considers multiple ethical factors that potentially influence managerial ethical decision-making. These factors are used as inputs into cluster analysis to identify distinct patterns of influence on managerial ethical decision-making. Results Data analysis from the participants (n=441) showed a similar spread of the five managerial ethical profiles (Knights, Guardian Angels, Duty Followers, Defenders and Chameleons) across clinically trained and non-clinically trained healthcare managers. There was no substantial statistical difference between the two manager types (clinical and non-clinical) across the five profiles. Conclusion This paper demonstrates that managers from clinical backgrounds have ethical decision-making profiles similar to those of non-clinically trained managers. This is an important finding for manager development and for how organisations understand the various approaches to managerial decision-making across the different ethical profiles.
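The taxonomic approach described above rests on standard cluster analysis of ethical-factor scores. A minimal sketch of that kind of profiling is shown below; the input file, factor column names and the choice of k-means with five clusters are illustrative assumptions, not details taken from the study.

```python
# Illustrative only: cluster per-manager ethical-factor scores into five profiles.
# The CSV and column names are hypothetical stand-ins for MEP Scale factor scores.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

scores = pd.read_csv("mep_scores.csv")                  # hypothetical input file
factors = ["egoism", "utilitarianism", "deontology",    # assumed factor columns
           "relativism", "organisational_pressure"]

X = StandardScaler().fit_transform(scores[factors])     # put all factors on a common scale
scores["profile"] = KMeans(n_clusters=5, n_init=10,
                           random_state=0).fit_predict(X)

print(scores.groupby("profile")[factors].mean())        # inspect the cluster centres
```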

Relevance:

30.00%

Publisher:

Abstract:

The adequacy and efficiency of existing legal and regulatory frameworks dealing with corporate phoenix activity have been repeatedly called into question over the past two decades through various reviews, inquiries, targeted regulatory operations and the implementation of piecemeal legislative reform. Despite these efforts, phoenix activity does not appear to have abated. While there is no law in Australia that declares 'phoenix activity' to be illegal, the behaviour that tends to manifest in phoenix activity is capable of transgressing a vast array of law, including, for example, corporate law, tax law and employment law. This paper explores the notion that the persistence of phoenix activity despite the sheer extent of this law suggests that the law is not acting as powerfully as it might as a deterrent. Economic theories of entrepreneurship and innovation can to some extent explain why this is the case, and they also offer a sound basis for the evaluation and reconsideration of the existing law. The challenges facing key regulators are significant. Phoenix activity is not limited to a particular corporate demographic: it occurs in SMEs, large companies and corporate groups. The range of behaviour that can amount to phoenix activity is so broad that not all of it is illegal. This paper considers regulatory approaches to these challenges through an analysis of how the underlying law capturing illegal phoenix activity is detected and enforced. Remedying the mischief of phoenix activity is of practical importance. The benefits include continued confidence in our economy, law that inspires best practice among directors, and law that is articulated so that penalties act as a sufficient deterrent and the regulatory system is able to detect offenders and bring them to account. Any further reforms must accommodate and tolerate legal phoenix activity, at least to some extent. Even then, phoenix activity pushes tolerance of repeated entrepreneurial failure to its absolute limit. The more limited liability is misused and abused, the stronger the argument for placing some restrictions on access to it. This paper proposes that such an approach is a legitimate next step for a robust and mature capitalist economy.

Relevance:

30.00%

Publisher:

Abstract:

Historically, two-dimensional (2D) cell culture has been the preferred method of producing disease models in vitro. Recently, there has been a move away from 2D culture in favor of generating three-dimensional (3D) multicellular structures, which are thought to be more representative of the in vivo environment. This transition has brought with it an influx of technologies capable of producing these structures in various ways. However, it is becoming evident that many of these technologies do not perform well in automated in vitro drug discovery units. We believe that this is a result of their incompatibility with high-throughput screening (HTS). In this study, we review a number of technologies currently available for producing in vitro 3D disease models. We assess their amenability to high-content screening and HTS, and highlight our own work in attempting to address many of the practical problems that are hampering the successful deployment of 3D cell systems in mainstream research.

Relevance:

30.00%

Publisher:

Abstract:

This paper examines how teams and teamwork research have been conceptualised in the fields of sport psychology and organizational psychology. Specifically, it provides a close inspection of the general theoretical assumptions that inhere in the two disciplines. The results of a discursive analysis of the research literature suggest that the fields have significantly different ways of conceptualising teams and teamwork, and that conceptual borrowing may prove fruitful. A key argument, however, is that a sound understanding of these differences is necessary for meaningful cross-fertilisation to take place. Working from this premise, the essential differences between sport and organizational approaches to teams are outlined. The paper concludes with a discussion of the contributions that organizational psychology can make to understandings of sport-oriented teams.

Relevance:

30.00%

Publisher:

Abstract:

One major reason for the global decline of biodiversity is habitat loss and fragmentation. Conservation areas can be designed to reduce biodiversity loss, but as resources are limited, conservation efforts need to be prioritized in order to achieve the best possible outcomes. The field of systematic conservation planning developed as a response to opportunistic approaches to conservation that often resulted in biased representation of biological diversity. The last two decades have seen the development of increasingly sophisticated methods that account for information about biodiversity conservation goals (benefits), economic considerations (costs) and socio-political constraints.

In this thesis I focus on two general topics related to systematic conservation planning. First, I address two aspects of the question of how biodiversity features should be valued. (i) I investigate the extremely important but often neglected issue of differential prioritization of species for conservation. Species prioritization can be based on various criteria, and is always goal-dependent, but it can also be implemented in a scientifically more rigorous way than is the usual practice. (ii) I introduce a novel framework for conservation prioritization based on continuous benefit functions, which convert increasing levels of biodiversity feature representation into increasing conservation value, using the principle that more is better. Traditional target-based systematic conservation planning is a special case of this approach in which a step function is used as the benefit function. We have further expanded the benefit function framework for area prioritization to address issues such as protected area size and habitat vulnerability.

In the second part of the thesis I address the application of community-level modelling strategies to conservation prioritization. One of the most serious issues in systematic conservation planning currently is not a deficiency of methodology for selection and design, but simply the lack of data. Community-level modelling offers a surrogate strategy that makes conservation planning more feasible in data-poor regions. We have reviewed the available community-level approaches to conservation planning; these range from simplistic classification techniques to sophisticated modelling and selection strategies. We have also developed a general and novel community-level approach to conservation prioritization that significantly improves on previously available methods.

This thesis introduces further degrees of realism into conservation planning methodology. The benefit-function-based conservation prioritization framework largely circumvents the problematic phase of target setting and, by allowing trade-offs between species representation, provides a more flexible and hopefully more attractive approach for conservation practitioners. The community-level approach seems highly promising and should prove valuable for conservation planning, especially in data-poor regions. Future work should focus on integrating prioritization methods that deal with the multiple factors jointly influencing the prioritization process, and on further testing and refining the community-level strategies using real, large datasets.
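The contrast drawn above between target-based selection and continuous benefit functions can be made concrete with a small sketch. The thesis does not specify the functional forms here, so the concave power function and the 30% target below are purely illustrative assumptions.

```python
# Illustrative benefit functions: conservation value as a function of the
# fraction of a biodiversity feature's distribution that is protected.
import numpy as np

def step_benefit(p, target=0.3):
    """Target-based planning: full value only once the (assumed) target is met."""
    return np.where(p >= target, 1.0, 0.0)

def continuous_benefit(p, z=0.25):
    """'More is better': value increases continuously, with diminishing returns."""
    return np.power(p, z)

protected = np.linspace(0.0, 1.0, 6)
print(step_benefit(protected))        # 0 until the 30% target is reached, then 1
print(continuous_benefit(protected))  # smooth, concave increase
```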

Relevance:

30.00%

Publisher:

Abstract:

Objectives This efficacy study assessed the added impact of real-time computer prompts on a participatory approach to reduce occupational sedentary exposure and increase physical activity. Design Quasi-experimental. Methods 57 Australian office workers (mean [SD]; age = 47 [11] years; BMI = 28 [5] kg/m²; 46 men) generated a menu of 20 occupational 'sit less and move more' strategies through participatory workshops, and were then tasked with implementing the strategies for five months (July–November 2014). During implementation, a sub-sample of workers (n = 24) used a chair sensor/software package (Sitting Pad) that gave real-time prompts to interrupt desk sitting. Baseline and intervention sedentary behaviour and physical activity (GENEActiv accelerometer; mean work-time percentages), and minutes spent sitting at desks (Sitting Pad; mean total time and longest bout), were compared between non-prompt and prompt workers using a two-way ANOVA. Results Workers spent close to three quarters of their work time sedentary, mostly sitting at desks (mean [SD]; total desk sitting time = 371 [71] min/day; longest bout spent desk sitting = 104 [43] min/day). Intervention effects were four times greater in workers who used real-time computer prompts (8% decrease in work-time sedentary behaviour and an increase in light-intensity physical activity; p < 0.01). The respective mean differences between baseline and intervention in total time spent sitting at desks, and in the longest bout spent desk sitting, were 23 and 32 min/day lower in prompt than in non-prompt workers (p < 0.01). Conclusions In this sample of office workers, real-time computer prompts facilitated the impact of a participatory approach on reductions in occupational sedentary exposure and increases in physical activity.
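The comparison reported above rests on a two-way ANOVA of accelerometer-derived outcomes across group (prompt vs. non-prompt) and period (baseline vs. intervention). A minimal sketch of such an analysis is shown below; the data file and column names are assumptions for illustration, not the study's actual variable names.

```python
# Illustrative two-way ANOVA: group (prompt vs non-prompt) x period (baseline vs
# intervention) on the percentage of work time spent sedentary.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("worktime_sedentary.csv")   # hypothetical long-format data

model = smf.ols("sedentary_pct ~ C(group) * C(period)", data=df).fit()
print(anova_lm(model, typ=2))                # main effects and the interaction term
```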

Relevance:

30.00%

Publisher:

Abstract:

It is demonstrated that the title reactions are best carried out at high concentrations, as indicated by mechanistic considerations: the observed high reaction orders and the possibility that the Cannizzaro reaction is driven by the hydrophobic effect, which brings the two molecules of the aldehyde reactant into proximity. The present studies have led to improved conditions, simplified workup, and excellent yields of products. The Tishchenko reaction converted benzaldehyde to benzyl benzoate with catalytic NaOMe/tetrahydrofuran in good yield, which is apparently unprecedented for this product of high commercial value.

Relevance:

30.00%

Publisher:

Abstract:

Background The Researching Effective Approaches to Cleaning in Hospitals (REACH) study will generate evidence about the effectiveness and cost-effectiveness of a novel cleaning initiative that aims to improve the environmental cleanliness of hospitals. The initiative is an environmental cleaning bundle, with five interdependent, evidence-based components (training, technique, product, audit and communication) implemented with environmental services staff to enhance hospital cleaning practices. Methods/design The REACH study will use a stepped-wedge randomised controlled design to test the study intervention, an environmental cleaning bundle, in 11 Australian hospitals. All trial hospitals will receive the intervention and act as their own control, with analysis undertaken of the change within each hospital based on data collected in the control and intervention periods. Each site will be randomised to one of the 11 intervention timings, with staggered commencement dates in 2016 and an intervention period of between 20 and 50 weeks. All sites will complete the trial at the same time in 2017. The inclusion criteria allow for a purposive sample of both public and private hospitals that have higher-risk patient populations for healthcare-associated infections (HAIs). The primary outcome (objective one) is the monthly number of Staphylococcus aureus bacteraemias (SABs), Clostridium difficile infections (CDIs) and vancomycin-resistant enterococci (VRE) infections per 10,000 bed days. Secondary outcomes for objective one include the thoroughness of hospital cleaning assessed using fluorescent marker technology, the bioburden of frequent-touch surfaces post cleaning, and changes in staff knowledge and attitudes about environmental cleaning. A cost-effectiveness analysis will determine the second key outcome (objective two): the incremental cost-effectiveness ratio from implementation of the cleaning bundle. The study uses the integrated Promoting Action on Research Implementation in Health Services (iPARIHS) framework to support the tailored implementation of the environmental cleaning bundle in each hospital. Discussion Evidence from the REACH trial will contribute to future policy and practice guidelines about hospital environmental cleaning. It will be used by healthcare leaders and clinicians to inform decision-making and the implementation of best-practice infection prevention strategies to reduce HAIs in hospitals. Trial registration Australia New Zealand Clinical Trial Registry ACTRN12615000325505
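The primary outcome above is expressed as a rate per 10,000 bed days. For readers unfamiliar with the metric, a simple calculation is sketched below; the case and bed-day figures are made-up numbers, not trial data.

```python
# Illustrative calculation of an infection rate per 10,000 occupied bed days.
def rate_per_10000_bed_days(cases: int, bed_days: int) -> float:
    """Monthly infection count normalised by patient exposure."""
    return cases / bed_days * 10_000

# e.g. 4 SAB cases in a month with 12,500 occupied bed days (hypothetical figures)
print(rate_per_10000_bed_days(4, 12_500))   # 3.2 per 10,000 bed days
```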

Relevance:

30.00%

Publisher:

Abstract:

Cord blood is a well-established alternative to bone marrow and peripheral blood stem cell transplantation. To this day, over 400,000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have been focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified.

In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as a part of the evaluation of cord blood collections for banking.

The quality of cord blood units has been the focus of several recent studies. However, hemostasis activation during cord blood collection is scarcely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), indicating platelet activation. Altogether three sample series were collected during the set-up of the cord blood bank as well as after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and was at a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized. The assessment of hemostasis activation could be implemented as a part of process control in cord blood banks.

Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and the scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.

Relevance:

30.00%

Publisher:

Abstract:

The problem of reconstruction of a refractive-index distribution (RID) in optical refraction tomography (ORT) with optical path-length difference (OPD) data is solved using two adaptive-estimation-based extended-Kalman-filter (EKF) approaches. First, a basic single-resolution EKF (SR-EKF) is applied to a state variable model describing the tomographic process, to estimate the RID of an optically transparent refracting object from noisy OPD data. The initialization of the biases and covariances corresponding to the state and measurement noise is discussed, and these biases and covariances are adaptively estimated. An EKF is then applied to the wavelet-transformed state variable model to yield a wavelet-based multiresolution EKF (MR-EKF) solution approach. To numerically validate the adaptive EKF approaches, we evaluate them with benchmark studies of standard stationary cases, where comparative results with commonly used efficient deterministic approaches can be obtained. Detailed reconstruction studies for the SR-EKF and two versions of the MR-EKF (with Haar and Daubechies-4 wavelets) compare well with those obtained from a typically used variant of the (deterministic) algebraic reconstruction technique, the average correction per projection method, thus establishing the capability of the EKF for ORT. To the best of our knowledge, the present work contains unique reconstruction studies encompassing the use of the EKF for ORT in both single-resolution and multiresolution formulations, as well as the adaptive estimation of the EKF's noise covariances. (C) 2010 Optical Society of America
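The abstract does not reproduce the state variable model, so the sketch below is only a generic extended Kalman filter predict/update cycle of the kind the SR-EKF builds on; the ORT-specific measurement function, Jacobians and noise statistics would have to come from the paper itself.

```python
# Generic EKF predict/update step (not the paper's ORT-specific model).
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """x, P: state estimate and covariance; z: measurement;
    f, h: process and measurement functions; F, H: their Jacobians
    evaluated at the current estimate; Q, R: noise covariances."""
    # Predict
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - h(x_pred)                        # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```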

Relevance:

30.00%

Publisher:

Abstract:

Nucleation is the first step of a first-order phase transition; a new phase always arises in nucleation phenomena. The two main categories of nucleation are homogeneous nucleation, where the new phase forms within a uniform substance, and heterogeneous nucleation, where nucleation occurs on a pre-existing surface. In this thesis the main attention is paid to heterogeneous nucleation. The thesis treats nucleation phenomena from two theoretical perspectives: the classical nucleation theory and the statistical mechanical approach. The formulation of the classical nucleation theory relies on equilibrium thermodynamics and the use of macroscopically determined quantities to describe the properties of small nuclei, sometimes consisting of just a few molecules. The statistical mechanical approach is based on interactions between single molecules and does not rest on the same assumptions as the classical theory. This work brings together the present theoretical knowledge of heterogeneous nucleation and utilizes it in computational model studies. A new exact molecular approach to heterogeneous nucleation was introduced and tested by Monte Carlo simulations, and the results obtained from the molecular simulations were interpreted by means of the concepts of the classical nucleation theory. Numerical calculations were carried out for a variety of substances nucleating on different surfaces. The classical theory of heterogeneous nucleation was employed in calculations of one-component nucleation of water on newsprint paper, Teflon and cellulose film, and of binary nucleation of water-n-propanol and water-sulphuric acid mixtures on silver nanoparticles. The results were compared with experimental results. The molecular simulation studies involved homogeneous nucleation of argon and heterogeneous nucleation of argon on a planar platinum surface. It was found that the use of a microscopic contact angle as a fitting parameter in calculations based on the classical theory of heterogeneous nucleation leads to fair agreement between theoretical predictions and experimental results. In the presented cases the microscopic angle was always smaller than the contact angle obtained from macroscopic measurements. Furthermore, the molecular Monte Carlo simulations revealed that the concept of a geometrical contact parameter in heterogeneous nucleation calculations can work surprisingly well even for very small clusters.
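In the classical theory referred to above, the contact angle enters through the usual geometric factor for nucleation on a flat surface; the standard textbook form is recalled below for orientation (the thesis itself may use more general geometries than a planar substrate).

```latex
% Classical heterogeneous nucleation on a planar substrate:
% the free-energy barrier is the homogeneous barrier scaled by f(theta).
\[
  \Delta G^{*}_{\mathrm{het}} \;=\; f(\theta)\,\Delta G^{*}_{\mathrm{hom}},
  \qquad
  f(\theta) \;=\; \frac{(2+\cos\theta)(1-\cos\theta)^{2}}{4},
\]
% so a smaller contact angle (better wetting) gives f(theta) < 1 and a lower barrier.
```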

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the possibility of a direct second-order transition out of a collinear Néel phase to a paramagnetic spin liquid in two-dimensional quantum antiferromagnets. Contrary to conventional wisdom, we show that such second-order quantum transitions can potentially occur to certain spin liquid states popular in theories of the cuprates. We provide a theory of this transition and study its universal properties in an epsilon expansion. The existence of such a transition has a number of interesting implications for spin-liquid-based approaches to the underdoped cuprates. In particular, it considerably clarifies existing ideas for incorporating antiferromagnetic long-range order into such a spin-liquid-based approach.

Relevance:

30.00%

Publisher:

Abstract:

The availability of electrophoretically homogeneous rabbit penicillin carrier receptor protein (CRP) by affinity chromatography afforded an ideal in vitro system to calculate the thermodynamic parameters of binding of penicillin and analogues with CRP, as well as the competitive binding of such analogues with CRP in the presence of ¹⁴C-penicillin G. The kinetics of association of CRP with 7-deoxy penicillin, which does not bind covalently with CRP, have been studied through equilibrium dialysis with ¹⁴C-7-deoxybenzyl penicillin, giving K = 2.79 × 10⁶ M⁻¹ and −ΔG = 8.106 kcal/mol, as well as through fluorescence quenching studies with excitation at λ 280 nm, giving K = 3.573 × 10⁶ M⁻¹ and −ΔG = 8.239 kcal/mol. The fluorescence quenching studies have been extended to the CRP-benzyl penicillin and CRP-6-aminopenicillanic acid (6-APA) systems also. The fluorescence data with benzyl penicillin indicate two conformational changes in CRP: a fast change corresponding to non-covalent binding to CRP, as with 7-deoxy penicillin, and a slower change due to covalent bond formation. With 6-APA the first change is not observed; only the conformational change corresponding to covalent binding is seen. Competitive binding studies indicate that the order of binding of CRP with the analogues of penicillin is as follows: methicillin > 6-APA > carbenicillin > o-nitrobenzyl penicillin > cloxacillin ≈ benzyl penicillin ≈ 6-phenylacetamido penicillanyl alcohol ≈ 7-phenylacetamido desacetoxycephalosporanic acid ≈ p-aminobenzyl penicillin ≈ p-nitrobenzyl penicillin > ticarcillin > o-aminobenzyl penicillin > amoxycillin > 7-deoxybenzyl penicillin > ampicillin. From these data it has been possible to partially delineate the topology of the penicillin-binding cleft of the CRP as well as some of the functional groups in the cleft responsible for the binding process.
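The association constants and free energies quoted above are related in the usual way; the standard relation is given below for reference (the measurement temperature is not stated in the abstract, so no numerical check is attempted here).

```latex
% Standard relation between an association constant and the binding free energy.
\[
  \Delta G^{\circ} \;=\; -RT\,\ln K_{a},
\]
% with R the gas constant and T the absolute temperature of the binding experiment.
```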

Relevance:

30.00%

Publisher:

Abstract:

The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols have been increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes, directly by absorbing and reflecting radiation and indirectly by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols continues to be a highly uncertain science. Better understanding of the effects of aerosols requires more information on aerosol chemistry. Before the determination of aerosol chemical composition by the various available analytical techniques, aerosol particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques harbor drawbacks. In this study, novel methodologies were developed for sampling and determination of the chemical composition of atmospheric aerosols.

In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted into and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, PILS was modified and the sampling procedure was optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds before PILS. Mixtures of water with alcohols were introduced to increase the solubility of aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS).

The laboriousness of LLE followed by GC-MS analysis next prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples; these two compound groups are thought to be key for aerosol growth. Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed-phase anion exchange (MAX) were tested as extraction materials. MAX proved to be efficient for acids, but no tested material offered sufficient adsorption for aldehydes. Thus, PILS samples were extracted only with MAX to guarantee good results for the organic acids determined by liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively easy, and here on-line coupling of PILS with HPLC-MS through the SPE trap produced some interesting data on relevant acids in atmospheric aerosol samples.

A completely different approach to aerosol sampling, namely differential mobility analyzer (DMA)-assisted filter sampling, was employed in this study to provide information about the size-dependent chemical composition of aerosols and understanding of the processes driving aerosol growth from nano-sized clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and aerosols were collected on Teflon or quartz fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA for alternating 15-minute periods. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-sized aerosols and always needs to be taken into account.

A further aim of this study was to determine the oxidation products of β-caryophyllene (the major sesquiterpene in the boreal forest) in aerosol particles. Since reference compounds are needed for verification of the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid. All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.

Relevance:

30.00%

Publisher:

Abstract:

Tutte (1979) proved that the disconnected spanning subgraphs of a graph can be reconstructed from its vertex deck. This result is used to prove that if we can reconstruct a set of connected graphs from the shuffled edge deck (SED), then the vertex reconstruction conjecture is true. It is proved that a set of connected graphs can be reconstructed from the SED when all the graphs in the set are claw-free or all are P4-free. Such a problem is also solved for a large subclass of the class of chordal graphs; this subclass contains the maximal outerplanar graphs. Finally, two new conjectures, which imply the edge reconstruction conjecture, are presented. Conjecture 1 demands the construction of a stronger k-edge hypomorphism (to be defined later) from the edge hypomorphism. It is well known that Nash-Williams' theorem applies to a variety of structures. To prove Conjecture 2, we need to incorporate more graph-theoretic information into Nash-Williams' theorem.
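The vertex deck mentioned in Tutte's result is simply the multiset of one-vertex-deleted subgraphs. A small sketch of how it can be generated is below, using networkx purely for illustration; the shuffled edge deck construction from the paper is not reproduced here.

```python
# Illustrative only: the vertex deck of a graph is the multiset of subgraphs
# obtained by deleting each vertex in turn (the cards are unlabelled in the conjecture).
import networkx as nx

def vertex_deck(G: nx.Graph):
    deck = []
    for v in G.nodes():
        card = G.copy()
        card.remove_node(v)
        deck.append(card)
    return deck

G = nx.cycle_graph(5)                  # toy example: C5
for card in vertex_deck(G):
    print(sorted(dict(card.degree()).values()))   # degree sequence of each card
```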