974 results for analogy calculation


Relevance: 10.00%

Abstract:

This paper takes its root in a trivial observation: management approaches are unable to provide relevant guidelines for coping with uncertainty and trust in our modern world. Managers therefore seek to reduce uncertainty through information-supported decision-making, sustained by ex-ante rationalization. They strive to achieve the best possible solution, stability, predictability, and control of the "future". Hence, they turn to a plethora of "prescriptive panaceas" and "management fads" that promise simple solutions through best practices. However, these solutions are ineffective. They address only one part of a system (e.g. an organization) instead of the whole, missing the interactions and interdependencies with other parts and leading to "suboptimization". Furthermore, classical cause-and-effect investigations are not very helpful in this regard. Where do we go from there? In this conversation, we want to challenge the assumptions supporting traditional management approaches and shed some light on the problem of management-discourse fads, using the concept of maturity and maturity models in the context of temporary organizations as a support for reflection. The global economy is characterized by the use and development of standards, and compliance with standards as a practice is said to enable better decision-making by managers under uncertainty, control of complexity, and higher performance. Among this plethora of standards, organizational maturity and maturity models hold a specific place, owing to the general belief that organizational performance is a dependent variable of continuous (business) process improvement, grounded in a kind of evolutionary metaphor. Our intention is neither to offer a new "evidence-based management fad" to practitioners, nor to suggest a research gap to scholars.
Rather, we want to open an assumption-challenging conversation with regard to mainstream approaches (neo-classical economics and organization theory), turning "our eyes away from the blinding light of eternal certitude towards the refracted world of turbid finitude" (Long, 2002, p. 44), which generates what Bernstein has named "Cartesian Anxiety" (Bernstein, 1983, p. 18), and to revisit the conceptualization of maturity and maturity models. We rely on conventions theory and a systemic-discursive perspective. These two lenses have both information & communication and self-producing systems as common threads. Furthermore, the narrative approach is well suited to exploring complex ways of thinking about organizational phenomena as complex systems. This approach is relevant to our object of curiosity, i.e. the concept of maturity and maturity models, since maturity models (as standards) are discourses and systems of regulation. The main contribution of this conversation is the suggestion to move from a neo-classical "theory of the game", which aims at making the complex world simpler in playing the game, to a "theory of the rules of the game", which aims at influencing and challenging the rules of the game constitutive of maturity models (conventions, governing systems), making individual calculation compatible with the social context, and making possible the coordination of relationships and cooperation between agents with divergent, or potentially divergent, interests and values. A second contribution is the reconceptualization of maturity as a structural coupling between conventions, rather than as an independent variable leading to organizational performance.

Relevance: 10.00%

Abstract:

OBJECTIVES: To describe the recruitment strategy and the association between facility and staff characteristics and the success of resident recruitment for the Promoting Independence in Residential Care (PIRC) trial. DESIGN: Cross-sectional study of staff and facility characteristics and recruitment rates within facilities, with calculation of cluster effects for multiple measures. SETTING AND PARTICIPANTS: Staff of low-level-dependency residential care facilities, and residents able to engage in a physical activity program, in 2 cities in New Zealand. MEASURES: A global impression of staff willingness to facilitate research was gauged by research nurses; facility characteristics were measured by staff interview. Relevant outcomes were measured by resident interview and included the following: (1) Function: Late Life FDI scale, timed up-and-go, FICSIT balance scale, and the Elderly Mobility Scale; (2) Quality of life: EuroQol quality of life scale, Life Satisfaction Index; and (3) falls, assessed by audit of the medical record. Correlations between recruitment rates, facility characteristics, and the global impression of staff willingness to participate were investigated. Design effects were calculated on outcomes. RESULTS: Forty-one (85%) facilities and 682 (83%) residents participated; the median age was 85 years (range 65-101), and 74% were women. Participants had complex health problems. Recruitment rates were associated (but did not increase linearly) with the perceived willingness of staff, and were not associated with facility size. Design effects from the cluster recruitment differed according to outcome. CONCLUSIONS: The recruitment strategy succeeded in recruiting a large sample of people with complex comorbidities and high levels of functional disability despite perceptions of staff reluctance. Staff willingness was related to recruitment success.
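The cluster design effects reported above can be illustrated with the standard variance-inflation formula, DEFF = 1 + (m − 1)·ICC, where m is the mean cluster (facility) size and ICC is the intracluster correlation. A minimal sketch, with the caveat that the ICC value below is invented for illustration (the trial reports outcome-specific design effects):

```python
def design_effect(mean_cluster_size: float, icc: float) -> float:
    """Variance inflation due to cluster sampling: DEFF = 1 + (m - 1) * ICC."""
    return 1.0 + (mean_cluster_size - 1.0) * icc

# 682 residents across 41 facilities gives a mean cluster size of ~16.6;
# an assumed ICC of 0.05 would inflate the effective variance by about 78%.
deff = design_effect(682 / 41, 0.05)
```

An outcome with DEFF near 1 behaves like an individually sampled cohort, while outcomes with strong within-facility correlation require proportionally larger samples, which is why the design effects differed by outcome.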

Relevance: 10.00%

Abstract:

Research background: For almost 80 years the Chuck Taylor (or Chuck T's) All Star basketball shoe has been an iconic item of fashion apparel. The Chuck T's were first designed in 1921 by Converse, an American shoe company, and over the decades they became popular not purely for sports and athletic purposes but evolved into the shoe of choice for many subcultural groups as a fashion item. In some circles the Chuck Taylor is still seen as the "coolest" sneaker of all time, one which will never go out of fashion regardless of changing trends. With over 600 million pairs sold all over the world since its release, the Converse shoe is representative not only of a fashion culture but also of a consumption culture that evolved as the driving force behind the massive growth of the Western economic system during the 20th century. Artisan Gallery (Brisbane), in conjunction with the exhibition Reboot: Function, Fashion and the Sneaker, a history of the sneaker, selected 20 designers to customise and re-design the classic Converse Chuck Taylor All Stars shoe, and in doing so highlighted the diversity of forms possible for creative outcomes. As Artisan Gallery curator Kirsten Fitzpatrick states, "We were expecting people to draw and paint on them. Instead, we had shoes... mounted as trophies", referring to the presentation of "Converse Consumption". The exhibition ran from 21 June to 16 August 2012. Research question: The Chuck T's is one of many overwhelmingly commercially successful designs of the last century. Nowadays we are faced with the significant problems of overconsumption and the stress this places on the natural ecosystem, and on people as a result. As an active member of the industrial design fraternity, a discipline that sits at the core of this problem, how can I use this opportunity to comment on the significant issue of consumption? An effective way to do this was to associate the consumption of goods with the consumption of sugar.
There are significant similarities between our ceaseless desire to consume products and our fervent need to consume indulgent sweet foods. Artisan statement: Delicious, scrumptious, delectable... your pupils dilate, your blood pressure spikes, your liver goes into overdrive. Immediately, your brain cuts off the adenosine receptors, preventing drowsiness. Your body increases dopamine production, in turn stimulating the pleasure receptors in your brain. Your body absorbs all the sweetness and turns it into fat, while all the nutrients that you actually require are starting to be destroyed, about to be expelled. And this is only after one bite! After some time, though, your body comes crashing back to earth. You become irritable and begin to feel sluggish. Your eyelids seem heavy while your breathing pattern changes. Your body has consumed all the energy and destroyed all available nutrients. You literally begin to shut down. These are the physiological effects of sugar consumption: a perfect analogy for our modern-day consumer-driven world. Enjoy your dessert! Research contribution: "Converse Consumption" contributes to the conversation about over-consumption by compelling people to reflect on their consumption behaviour through the reconceptualisation of the deconstructed Chuck T's in an attractive edible form. The viewer has to reconcile the desire to consume the indulgent-looking dessert with the contradictory fact that it is composed of a pair of shoes. That the shoes are Chuck T's makes the effect even more powerful, owing to their iconic status. These clashing motivations are what make "Converse Consumption" a bizarre yet memorable experience. Significance: The exhibition was viewed by in excess of 1000 people and generated exceptional media coverage and public exposure/impact.
As Artisan Gallery curator Kirsten Fitzpatrick states, "20 of Brisbane's best designers were given the opportunity to customise their own Converse Sneakers, with The Converse Blank Canvas Project." Selection for this project demonstrates the calibre and prominence of the design work.

Relevance: 10.00%

Abstract:

The early warning provided by real-time prediction of rain-induced instability of natural residual slopes helps to minimise human casualties from such slope failures. Slope instability prediction is complicated, as it is influenced by many factors, including soil properties, soil behaviour, slope geometry, and the location and size of deep cracks in the slope. These deep cracks can facilitate rainwater infiltration into the deep soil layers and reduce the unsaturated shear strength of residual soil. Subsequently, a slip surface can form, triggering a landslide even in partially saturated soil slopes. Although past research has shown the effects of surface cracks on soil stability, research examining the influence of deep cracks is very limited. This study aimed to develop methodologies for predicting the real-time rain-induced instability of natural residual soil slopes with deep cracks. The results can be used to warn against potential rain-induced slope failures. The literature review on rain-induced slope instability of unsaturated residual soil associated with soil cracks reveals that only limited studies have been done in the following areas related to this topic:
- Methods for detecting deep cracks in residual soil slopes.
- Practical application of unsaturated soil theory in slope stability analysis.
- Mechanistic methods for real-time prediction of rain-induced residual soil slope instability in critical slopes with deep cracks.
Two natural residual soil slopes at Jombok Village, Ngantang City, Indonesia, located near a residential area, were investigated to obtain the parameters required for the stability analysis of the slope. A survey first identified all related field geometrical information, including the slope, roads, rivers, buildings, and the boundaries of the slope. Second, the electrical resistivity tomography (ERT) method was used on the slope to identify the location and geometrical characteristics of deep cracks.
The two ERT array models employed in this research are dipole-dipole and azimuthal. Next, bore-hole tests were conducted at different locations on the slope to identify soil layers and to collect undisturbed soil samples for laboratory measurement of the soil parameters required for the stability analysis. At the same bore-hole locations, the Standard Penetration Test (SPT) was undertaken. Undisturbed soil samples taken from the bore-holes were tested in the laboratory to determine the variation of the following soil properties with depth:
- Classification and physical properties such as grain size distribution, Atterberg limits, water content, dry density, and specific gravity.
- Saturated and unsaturated shear strength properties, using a direct shear apparatus.
- Soil water characteristic curves (SWCC), using the filter paper method.
- Saturated hydraulic conductivity.
The following three methods were used to detect and simulate the location and orientation of cracks in the investigated slope: (1) the electrical resistivity distribution of the sub-soil obtained from ERT; (2) the profile of classification and physical properties of the soil, based on laboratory testing of soil samples collected from bore-holes and visual observation of the cracks on the slope surface; (3) the stress distribution obtained from 2D dynamic analysis of the slope using QUAKE/W software, together with the laboratory-measured soil parameters and earthquake records of the area. It was assumed that the deep crack in the slope under investigation was generated by earthquakes. A good agreement was obtained when comparing the location and orientation of the cracks detected by Method-1 and Method-2. However, the cracks simulated in Method-3 were not in good agreement with the output of Method-1 and Method-2. This may have been due to the material properties used and the assumptions made for the analysis.
From Method-1 and Method-2, it can be concluded that the ERT method can be used to detect the location and orientation of a crack in a soil slope when the ERT is conducted in very dry or very wet soil conditions. In this study, the cracks detected by the ERT were used for the stability analysis of the slope. The stability of the slope was determined using the factor of safety (FOS) of a critical slip surface obtained by SLOPE/W using the limit equilibrium method. Pore-water pressure values for the stability analysis were obtained by coupling in a transient seepage analysis of the slope using the finite-element-based software SEEP/W. A parametric study on the stability of the investigated slope revealed that the existence of deep cracks and their location in the soil slope are critical for its stability. The following two steps are proposed to predict the rain-induced instability of a residual soil slope with cracks. (a) Step-1: A transient stability analysis of the slope is conducted from the date of the investigation (initial conditions are based on the investigation) to the current date, using measured rainfall data. The stability analyses are then continued for the next 12 months using the predicted annual rainfall, based on the previous five years' rainfall data for the area. (b) Step-2: The stability of the slope is calculated in real time using real-time measured rainfall. In this calculation, rainfall is predicted for the next hour or 24 hours, and the stability of the slope is calculated one hour or 24 hours in advance using real-time rainfall data. If the Step-1 analysis shows critical stability for the forthcoming year, it is recommended that Step-2 be used for more accurate warning against future failure of the slope.
In this research, the application of Step-1 to an investigated slope (Slope-1) showed that its stability was not approaching a critical value for the year 2012 (until 31 December 2012), and therefore the application of Step-2 was not necessary for that year. A case study (Slope-2) was used to verify the applicability of the complete proposed predictive method. A landslide event occurred at Slope-2 on 31 October 2010. The transient seepage and stability analyses of the slope, using data obtained from field tests (bore-holes, SPT, ERT) and laboratory tests, were conducted for 12 June 2010 following Step-1, and found that the slope was in a critical condition on that date. It was then shown that the application of Step-2 could have predicted this failure with sufficient warning time.
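The thesis computes FOS with coupled SEEP/W and SLOPE/W analyses, but the way matric suction enters the factor of safety can be sketched with the much simpler infinite-slope model under the extended Mohr-Coulomb criterion. All parameter values below are invented, and this is not the author's method; it only illustrates the underlying mechanism that rainfall-driven loss of suction lowers FOS:

```python
import math

def infinite_slope_fos(c_eff, gamma, depth, beta_deg, phi_eff_deg,
                       suction, phi_b_deg):
    """Factor of safety of an infinite slope, with matric suction adding
    shear strength through the phi_b angle (extended Mohr-Coulomb).
    Consistent units assumed: kPa, kN/m^3, m, degrees."""
    beta = math.radians(beta_deg)
    sigma_n = gamma * depth * math.cos(beta) ** 2               # normal stress on slip plane
    tau_drive = gamma * depth * math.sin(beta) * math.cos(beta)  # driving shear stress
    strength = (c_eff
                + sigma_n * math.tan(math.radians(phi_eff_deg))
                + suction * math.tan(math.radians(phi_b_deg)))   # suction contribution
    return strength / tau_drive

# Rainfall infiltration (fast through deep cracks) destroys suction,
# so FOS drops as suction tends to zero.
fos_dry = infinite_slope_fos(5.0, 18.0, 2.0, 35.0, 30.0, suction=50.0, phi_b_deg=15.0)
fos_wet = infinite_slope_fos(5.0, 18.0, 2.0, 35.0, 30.0, suction=0.0, phi_b_deg=15.0)
```

fos_dry exceeds fos_wet, mirroring the mechanism by which deep cracks, acting as fast infiltration paths, push a slope towards failure during rain.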

Relevance: 10.00%

Abstract:

Shaft fracture at an early stage of operation is a common problem for a certain type of wind turbine. To determine the cause of shaft failure, a series of experimental tests was conducted to evaluate the chemical composition and mechanical properties. A detailed analysis of the macroscopic features and microstructure of the shaft material was also performed to gain an in-depth understanding of the cause of fracture. The experimental tests and analysis show no significant differences between the material properties of the main shaft and the standard EN 10083-3:2006. The results show that stress concentration on the shaft surface close to the critical section, due to rubbing of the annular ring, coupled with the high stress concentration caused by the change of inner diameter of the main shaft, are the main causes of fracture of the main shaft. In addition, inhomogeneity of the main shaft microstructure accelerates the fracture process. A theoretical calculation of the equivalent stress at the end of the shaft was also performed, which demonstrates that cracks can easily occur under the action of impact loads. The contribution of this paper is to provide a reference for fracture analysis of similar main shafts of wind turbines.
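The abstract does not reproduce the equivalent-stress calculation itself; a common way to estimate it at a stepped shaft section is the von Mises combination of bending and torsional stresses with stress-concentration factors. The formula is the standard machine-design one, but the load values and Kt factors in the example are invented, not taken from the paper:

```python
import math

def shaft_equivalent_stress(moment, torque, diameter, kt_bend=1.0, kt_tors=1.0):
    """Von Mises equivalent stress at a solid circular shaft section under
    combined bending and torsion, with stress-concentration factors applied.
    Inputs in N*m and m; result in Pa."""
    sigma_b = 32.0 * moment / (math.pi * diameter ** 3)  # bending stress at surface
    tau = 16.0 * torque / (math.pi * diameter ** 3)      # torsional shear stress
    return math.sqrt((kt_bend * sigma_b) ** 2 + 3.0 * (kt_tors * tau) ** 2)
```

A geometric step such as the change of inner diameter raises the effective kt factors, which is exactly why the equivalent stress, and hence the fracture, localises at that section.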

Relevance: 10.00%

Abstract:

A condensation technique for degrees of freedom is first proposed to improve the computational efficiency of the meshfree method with Galerkin weak form. In the present method, scattered nodes without connectivity are divided into several subsets by cells of arbitrary shape. The local discrete equations are established over each cell by using moving kriging interpolation, in which the nodes located in the cell are used for the approximation. The condensation technique can then be introduced into the local discrete equations by transferring the equations of the inner nodes to equations of the boundary nodes of the cell. In this scheme, the calculation over each cell is carried out by the meshfree method with Galerkin weak form, and a local search is implemented in the interpolation. Numerical examples show that the present method has high computational efficiency and convergence, and good accuracy is also obtained.
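The degree-of-freedom condensation described above is, algebraically, classic static condensation: partition the cell's equations into boundary (b) and inner (i) DOFs and eliminate the inner block via the Schur complement, K_c = K_bb − K_bi K_ii⁻¹ K_ib. A minimal NumPy sketch; the moving kriging assembly itself is omitted, and any symmetric positive-definite cell matrix serves to illustrate:

```python
import numpy as np

def condense(K, f, boundary, inner):
    """Eliminate inner-node DOFs from K u = f, returning the reduced
    (condensed) system defined only on the boundary DOFs of a cell."""
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, inner)]
    Kib = K[np.ix_(inner, boundary)]
    Kii = K[np.ix_(inner, inner)]
    # Solve Kii X = [Kib | f_i] once, then form the Schur complement.
    X = np.linalg.solve(Kii, np.column_stack([Kib, f[inner]]))
    Kc = Kbb - Kbi @ X[:, :-1]
    fc = f[boundary] - Kbi @ X[:, -1]
    return Kc, fc
```

Solving the condensed system reproduces exactly the boundary part of the full solution, which is what makes the technique an efficiency device rather than an approximation.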

Relevance: 10.00%

Abstract:

Background: Hydroxyurea (HU), an inhibitor of ribonucleotide reductase, may potentiate the activity of 5-fluorouracil (5-FU) and folinic acid (FA) by reducing the deoxyribonucleotide pool available for DNA synthesis and repair. However, as HU may inhibit the formation of 5-fluoro-2'-deoxyuridine-5'-monophosphate (FdUMP), one of the principal active metabolites of 5-FU, the scheduling of HU may be critical. In vitro experiments suggest that administration of HU following 5-FU, maintaining the concentration in the region of 1 mM for six or more hours, significantly enhances the efficacy of 5-FU. Patients and methods: 5-FU/FA was given as follows: on days 1 and 2, FA 250 mg/m² (max. 350 mg) over two hours, followed by 5-FU 400 mg/m² by intravenous bolus (ivb) over 15 minutes and subsequently 5-FU 400 mg/m² by infusion (ivi) over 22 hours. HU was administered on day 3 immediately after the 5-FU, with 3 g ivb over 15 minutes followed by 12 g ivi over 12 hours. Results: Thirty patients were entered into the study. Median survival was nine months (range 1-51+ months). There were eight partial responses (28%, 95% CI: 13%-47%). The median duration of response was 6.5 months (range 4-9 months). Grade 3-4 toxicities included neutropenia (grade 3 in eight patients and grade 4 in five), anaemia (grade 3 in one patient) and diarrhoea (grade 3 in two patients). Neutropenia was associated with pyrexia in two patients. Phlebitis at the infusion site occurred in five patients. The treatment was complicated by pulmonary embolism in one patient and deep venous thrombosis in another. Conclusion: HU administered in this schedule is well tolerated. Based on these results and those of other phase II studies, a randomised phase III study of 5-FU, FA and HU versus 5-FU and FA using the standard de Gramont schedule is recommended.

Relevance: 10.00%

Abstract:

Stations on Bus Rapid Transit (BRT) lines ordinarily control line capacity because they act as bottlenecks. At stations with passing lanes, congestion may occur when buses maneuvering into and out of the platform stopping lane interfere with bus flow, or when a queue of buses forms upstream of the station, blocking inflow. We contend that, as bus inflow to the station area approaches capacity, queuing will become excessive in a manner similar to the operation of a minor movement at an unsignalized intersection. This analogy is used to treat BRT station operation and to analyze the relationship between station queuing and capacity. In the first of three stages, we conducted microscopic simulation modeling to study and analyze the operating characteristics of the station under near-steady-state conditions, through the output variables of capacity, degree of saturation, and queuing. A mathematical model was then developed to estimate the relationship between average queue and degree of saturation, and calibrated for a specified range of controlled scenarios of the mean and coefficient of variation of dwell time. Finally, the simulation results were calibrated and validated.
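The paper calibrates its own queue-versus-saturation relationship from simulation; the qualitative behaviour it relies on is the textbook steady-state one, in which the average queue grows sharply as the degree of saturation approaches 1. A hedged illustration only: this is the generic single-channel queue formula, not the calibrated BRT station model:

```python
def average_queue(degree_of_saturation: float) -> float:
    """Steady-state average queue of a single-channel queue, L = x / (1 - x).
    Valid only for 0 <= x < 1; the explosive growth near x = 1 is the
    behaviour the station model is built to capture."""
    x = degree_of_saturation
    if not 0.0 <= x < 1.0:
        raise ValueError("steady state requires 0 <= x < 1")
    return x / (1.0 - x)
```

At 50% saturation the average queue is one bus; at 90% it is nine, which is why operating a BRT station close to capacity produces excessive upstream queuing.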

Relevance: 10.00%

Abstract:

The rapid development of the World Wide Web has created massive amounts of information, leading to the information overload problem. In this circumstance, personalization techniques have been developed to help users find content that meets their personalized interests or needs out of the massively increasing information. User profiling techniques play the core role in this research. Traditionally, most user profiling techniques create user representations in a static way. However, in real-world applications user interests change over time. In this research we develop algorithms for mining user interests by integrating time-decay mechanisms into topic-based user interest profiling: time-forgetting functions are integrated into the calculation of topic interest measurements at an in-depth level. The experimental study shows that accounting for the temporal effects of user interests by integrating time-forgetting mechanisms yields better recommendation performance.
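One simple way to realise the time-forgetting function described above is exponential decay, where each interaction's contribution to a topic's interest score halves after a chosen half-life. The half-life and event data below are invented for illustration; the abstract does not specify the paper's exact decay function:

```python
import math

def topic_interest(events, now, half_life_days=30.0):
    """Topic-based interest scores with exponential time decay: each
    (topic, day) interaction contributes exp(-lambda * age), so older
    evidence of interest is gradually forgotten."""
    lam = math.log(2.0) / half_life_days
    scores = {}
    for topic, day in events:
        scores[topic] = scores.get(topic, 0.0) + math.exp(-lam * (now - day))
    return scores

# An old "sport" click decays to 0.25 after two half-lives; a fresh click
# counts in full, so recent behaviour dominates the profile.
profile = topic_interest([("sport", 0), ("sport", 60), ("news", 60)], now=60)
```

With a static profile both topics would score equally per click; the decayed profile instead tracks the user's current interests, which is the effect the experiments measure.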

Relevance: 10.00%

Abstract:

This paper describes a new method of indexing and searching large binary signature collections to efficiently find similar signatures, addressing the scalability problem in signature search. Signatures offer efficient computation with an acceptable measure of similarity in numerous applications. However, performing a complete search with a given search argument (a signature) requires a Hamming distance calculation against every signature in the collection. This quickly becomes excessive for large collections, presenting issues of scalability that limit their applicability. Our method efficiently finds similar signatures in very large collections, trading memory use and precision for greatly improved search speed. Experimental results demonstrate that our approach is capable of finding a set of nearest signatures to a given search argument with a high degree of speed and fidelity.
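The exhaustive baseline the paper improves on is easy to state: XOR two signatures and count the differing bits, for every signature in the collection. A minimal sketch, with signatures held as plain Python ints (production systems would pack them into fixed-width machine words):

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary signatures held as ints:
    XOR marks the differing bit positions, then count them."""
    return bin(a ^ b).count("1")

def nearest(query: int, collection, k: int = 3):
    """O(N) exhaustive nearest-signature search -- the per-query cost
    that motivates indexing the collection instead of scanning it."""
    return sorted(collection, key=lambda s: hamming(query, s))[:k]
```

Every query touches every signature, so the cost grows linearly with collection size; the indexing method trades memory and some precision to avoid exactly this scan.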

Relevance: 10.00%

Abstract:

There are currently more than 400 cities operating bike share programs. Purported benefits of bike share programs include flexible mobility, physical activity, and reduced congestion, emissions, and fuel use. Implicit or explicit in the calculation of program benefits are assumptions regarding the modes of travel replaced by bike share journeys. This paper examines the degree to which car trips are replaced by bike share, through an examination of survey and trip data from bike share programs in Melbourne, Brisbane, Washington, D.C., London, and Minneapolis/St. Paul. A secondary and unique component of this analysis examines the motor vehicle support services required for bike share fleet rebalancing and maintenance. These two components are then combined to estimate bike share's overall contribution to changes in vehicle kilometres travelled. The results indicate that the estimated mean reduction in car use due to bike share is at least twice the distance covered by operator support vehicles, with the exception of London, where the relationship is reversed, largely due to a low car mode substitution rate. As bike share programs mature, evaluation of their effectiveness in reducing car use may become increasingly important. This paper reveals that by increasing the convenience of bike share relative to car use, and by improving perceptions of safety, the capacity of bike share programs to reduce vehicle trips and yield overall net benefits will be enhanced. Researchers can adapt the analytical approach proposed in this paper to assist in the evaluation of current and future bike share programs.
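The paper's net accounting can be sketched as car-kilometres avoided minus support-vehicle kilometres. All numbers below are invented; the actual analysis works from surveyed, city-specific mode-substitution rates and operator trip logs:

```python
def net_vkt_reduction(bike_trips: int, mean_trip_km: float,
                      car_substitution_rate: float,
                      support_vehicle_km: float) -> float:
    """Net change in vehicle kilometres travelled (VKT): kilometres of car
    travel replaced by bike share trips, minus kilometres driven by
    rebalancing and maintenance support vehicles."""
    car_km_avoided = bike_trips * mean_trip_km * car_substitution_rate
    return car_km_avoided - support_vehicle_km

# A low car-substitution rate can flip the sign, as the paper reports
# for London, where support-vehicle distance exceeded car-km avoided.
net_high = net_vkt_reduction(1000, 3.0, 0.20, 250.0)
net_low = net_vkt_reduction(1000, 3.0, 0.05, 250.0)
```

The substitution rate is the decisive parameter: the same fleet operations yield a net benefit in one city and a net cost in another.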

Relevance: 10.00%

Abstract:

Hot spot identification (HSID) aims to identify potential sites (roadway segments, intersections, crosswalks, interchanges, ramps, etc.) with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology may identify a safe site as high risk (a false positive) or a high-risk site as safe (a false negative), and consequently lead to misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor-injury and property damage only (PDO) crashes, the challenge of accounting for crash severity in the methodology, and the selection of a proper safety performance function to model crash data that are often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and a quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than on the population mean as most methods do, which corresponds more closely with how black spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression.
Application of a quantile regression model to equivalent PDO crashes enables identification of a set of high-risk sites that reflects the true safety cost to society, simultaneously reduces the influence of under-reported PDO and minor-injury crashes, and overcomes the limitation of the traditional NB model in dealing with the preponderance-of-zeros problem and right-skewed datasets.
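The PDO-equivalency step folds crash counts of different severities into a single figure using severity weights. The weights below are invented placeholders; real applications derive them from jurisdictional crash-cost data, and the paper's specific weights are not given in the abstract:

```python
def epdo(pdo: int, minor_injury: int, serious_injury: int, fatal: int,
         weights=(1.0, 3.0, 10.0, 50.0)) -> float:
    """Equivalent-PDO crash count: each severity class is weighted by its
    cost relative to a property-damage-only crash (weights illustrative)."""
    w_pdo, w_minor, w_serious, w_fatal = weights
    return (w_pdo * pdo + w_minor * minor_injury
            + w_serious * serious_injury + w_fatal * fatal)
```

The resulting quantity is non-count and typically right-skewed, which is precisely what motivates the paper's switch from a negative binomial model to quantile regression.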

Relevance: 10.00%

Abstract:

Executive summary: Emergency Departments (EDs) locally, nationally, and internationally are becoming increasingly busy. Within this context, it can be challenging to deliver a health service that is safe, of high quality, and cost-effective. Whilst various models are described in the literature that aim to measure ED "work" or "activity", they are often not linked to a measure of the costs of providing such activity. It is important for hospital and ED managers to understand and apply this link so that optimal staffing and financial resourcing can be justifiably sought. This research is timely, given that Australia has moved towards a national Activity Based Funding (ABF) model for ED activity. ABF is believed to increase transparency of care and fairness (i.e. equal work receives equal pay). ABF involves a person-, performance- or activity-based payment system, and thus a move away from historical "block payment" models that do not incentivise efficiency and quality. The aim of the Statewide Workforce and Activity-Based Funding Modelling Project in Queensland Emergency Departments (SWAMPED) is to identify and describe best-practice Emergency Department (ED) workforce models within the current context of ED funding under an ABF model. The study comprises five distinct phases. This monograph (Phase 1) comprises a systematic review of the literature that was completed in June 2013. The remaining phases include a detailed survey of Queensland hospital EDs' resource levels, activity, and operational models of care; development of new resource models; development of a user-friendly modelling interface for ED managers; and production of a final report that identifies policy implications. The anticipated deliverable outcome of this research is an ABF-based Emergency Workforce Modelling Tool that will enable ED managers to profile both their workforce and their operational models of care.
Additionally, the tool will assist in more accurately informing the staffing numbers required in the future, inform planning of expected expenditures, and be used for standardisation and benchmarking across similar EDs. Summary of the findings: Within the remit of this review of the literature, the main findings include: 1. EDs are becoming busier and more congested. Rising demand, barriers to ED throughput, and transitions of care all contribute to ED congestion. In addition, requests by organisational managers and the community require a continued broadening of the scope of services provided by the ED, further increasing demand. As the population lives longer with more lifestyle diseases, its propensity to require ED care continues to grow. 2. Various models of care within EDs exist. Models often vary to account for site-specific characteristics such as staffing profile, ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Existing and new models implemented within EDs often depend on the target outcome requiring change; generally this is focussed on addressing issues in the input, throughput or output areas of the ED. Even for models targeting a similar demographic or illness, the structure and process elements underpinning the model can vary, which can affect outcomes and introduce variance in the patient and carer experience between and within EDs. Major models of care to manage throughput inefficiencies include: A. Workforce models of care, which focus on the appropriate level of staffing for a given workload to provide prompt, timely and clinically effective patient care within an emergency care setting.
The studies reviewed suggest that the early involvement of a senior medical decision-maker and/or specialised nursing roles such as Emergency Nurse Practitioners and the Clinical Initiatives Nurse, or primary-contact or extended-scope Allied Health Practitioners, can facilitate patient flow and improve key indicators such as length of stay and the number of patients who did not wait to be seen, amongst others. B. Operational models of care within EDs focus on mechanisms for streaming (e.g. fast-tracking) or otherwise grouping patient care based on acuity and complexity, to assist with minimising throughput inefficiencies. While studies support the positive impact of these models in general, it appears that they are most effective when adequately resourced. 3. Various methods of measuring ED activity exist. Measuring ED activity requires careful consideration of models of care and staffing profile, and the ability to account for factors including patient census, acuity, length of stay, intensity of intervention, and department skill-mix, plus an adjustment for non-patient-care time. 4. Gaps in the literature. Continued ED growth calls for new and innovative care delivery models that are safe, clinically effective, and cost-effective. New roles and stand-alone service delivery models are often evaluated in isolation, without considering the global and economic impact on staffing profiles. Whilst various models of accounting for and measuring health care activity exist, costing and cost-effectiveness studies are lacking for EDs, making accurate and reliable assessments of care models difficult. There is a need to further understand, refine, and account for measures of ED complexity that define a workload upon which resource and staffing determinations can be made into the future. There is also a need for continued monitoring and comprehensive evaluation of newly implemented workforce modelling tools.
This research acknowledges those gaps and aims to: • Undertake a comprehensive and integrated whole of department workforce profiling exercise relative to resources in the context of ABF. • Inform workforce requirements based on traditional quantitative markers (e.g. volume and acuity) combined with qualitative elements of ED models of care; • Develop a comprehensive and validated workforce calculation tool that can be used to better inform or at least guide workforce requirements in a more transparent manner.
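The review does not publish the workforce tool's formula, but the measurement factors it lists (census, acuity, LOS/treatment time, skill-mix, non-patient-care time) can be sketched as a simple workload-to-FTE calculation. Everything below is an illustrative assumption: the function names, the acuity multipliers, the 15% non-patient-care uplift and the 7.6-hour shift length are all hypothetical, not drawn from the tool itself.

```python
# Hypothetical sketch of a workload-based staffing estimate, NOT the
# validated tool described in the review.

def required_nursing_hours(presentations, mean_treatment_minutes,
                           acuity_weight, non_patient_care_fraction=0.15):
    """Estimate daily nursing hours for one patient stream.

    presentations: expected daily presentations in this stream
    mean_treatment_minutes: average direct-care time per presentation
    acuity_weight: multiplier reflecting acuity/intensity (assumed values)
    non_patient_care_fraction: uplift for handover, documentation, breaks
    """
    direct_hours = presentations * mean_treatment_minutes / 60 * acuity_weight
    return direct_hours * (1 + non_patient_care_fraction)

def required_fte(total_daily_hours, hours_per_fte_per_day=7.6):
    # 7.6 h is a common full-time shift length (assumption)
    return total_daily_hours / hours_per_fte_per_day

# Hypothetical ED with a fast-track stream and an acute stream
hours = (required_nursing_hours(60, 30, 1.0) +   # fast-track
         required_nursing_hours(40, 90, 1.4))    # acute, higher acuity
print(f"{required_fte(hours):.2f} FTE")
```

A real tool would, as the review notes, also have to capture skill-mix and site-specific model-of-care elements rather than a single acuity multiplier per stream.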

Resumo:

Much of what we currently understand about the structure and energetics of multiply charged anions in the gas phase is derived from measurements of photoelectron spectra of simple dicarboxylate dianions. Here we have employed a modified linear ion-trap mass spectrometer to undertake complementary investigations of the ionic products resulting from laser-initiated electron photodetachment of two model dianions. Electron photodetachment (ePD) of the [M−2H]²⁻ dianions formed from glutaric and adipic acid was found to result in a significant overall loss of ion signal, which is consistent with photoelectron studies that report the emission of slow secondary electrons (Xing et al., 2010). The ePD mass spectra reveal no signals corresponding to the intact [M−2H]•− radical anions; rather, [M−2H−CO₂]•− ions are identified as the only abundant ionic products, indicating that spontaneous decarboxylation follows ejection of the first electron. Interestingly, however, investigations of the structure and energetics of the [M−2H−CO₂]•− photoproducts by ion–molecule reactions and electronic structure calculations indicate that (i) these ions are stable with respect to secondary electron detachment and (ii) most of the ion population retains a distonic radical anion structure in which the radical remains localised at the position of the departed carboxylate moiety. These observations lead to the conclusion that the mechanism for loss of ion signal involves unimolecular rearrangement reactions of the nascent [M−2H]•− carbonyloxyl radical anions that compete favourably with direct decarboxylation. Several possible rearrangement pathways that facilitate electron detachment from the radical anion are identified and are computed to be energetically accessible. Such pathways provide an explanation for prior observations of slow secondary electron features in the photoelectron spectra of the same dicarboxylate dianions.
(C) 2013 Elsevier B.V. All rights reserved.
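As a back-of-envelope check on the species discussed above, the m/z values of the dianions and their decarboxylated photoproducts can be reconstructed from standard monoisotopic masses. This sketch is illustrative only: the atomic masses are standard values, the electron mass is neglected, and none of the numbers are taken from the paper itself.

```python
# m/z bookkeeping for [M-2H]2- and [M-2H-CO2]•- of glutaric and adipic acid.
# Standard monoisotopic masses in u; electron mass neglected (< 0.001 u).
MASS = {"C": 12.0, "H": 1.007825, "O": 15.994915}
PROTON = 1.007276

def monoisotopic(formula):
    """Neutral monoisotopic mass from a formula dict, e.g. {"C": 5, ...}."""
    return sum(MASS[el] * n for el, n in formula.items())

glutaric = {"C": 5, "H": 8, "O": 4}   # HOOC-(CH2)3-COOH
adipic   = {"C": 6, "H": 10, "O": 4}  # HOOC-(CH2)4-COOH
co2 = MASS["C"] + 2 * MASS["O"]       # neutral CO2 lost on decarboxylation

for name, f in (("glutaric", glutaric), ("adipic", adipic)):
    m = monoisotopic(f)
    dianion_mz = (m - 2 * PROTON) / 2         # [M-2H]2-, charge 2
    photoproduct_mz = m - 2 * PROTON - co2    # [M-2H-CO2]•-, charge 1
    print(f"{name}: [M-2H]2- m/z {dianion_mz:.4f}, "
          f"[M-2H-CO2]•- m/z {photoproduct_mz:.4f}")
```

The ~44 u gap between the radical-anion mass and the observed photoproduct is what identifies the loss as CO₂ in spectra of this kind.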

Resumo:

This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics and investigate their usefulness as predictors of quality assurance (QA) success and failure. 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements, and correlations were investigated. The mean field area factor provided a threshold field size (5 cm², equivalent to a 2.2 × 2.2 cm square field), below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure when averaged over all beams, despite being only weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests for plan accuracy, which may help minimise the time spent on QA assessments of treatments that are unlikely to pass.
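The mean-field-area screen described above can be sketched as a simple pre-QA filter. Only the 5 cm² threshold comes from the study; the beam data, the function names and the assumption that "mean field area" is the average open-aperture area over a beam's segments are all illustrative, not the TADA implementation.

```python
# Hedged sketch of a mean-field-area pre-QA screen; NOT the TADA code.

THRESHOLD_CM2 = 5.0  # below this mean field area, all study beams failed QA

def mean_field_area(segment_areas_cm2):
    """Average open-aperture area over a beam's segments (assumed definition)."""
    return sum(segment_areas_cm2) / len(segment_areas_cm2)

def flag_beams(beams):
    """beams: {beam_name: [segment areas, cm^2]} -> names predicted to fail QA."""
    return [name for name, areas in beams.items()
            if mean_field_area(areas) < THRESHOLD_CM2]

# Hypothetical plan: one small-aperture beam, one conventional beam
plan = {
    "beam1": [3.8, 4.2, 4.9],    # mean 4.3 cm^2 -> below threshold
    "beam2": [12.0, 15.5, 9.7],  # mean 12.4 cm^2 -> passes the screen
}
print(flag_beams(plan))  # flags only the small-aperture beam
```

Such a screen only predicts likely failures; as the study notes, per-beam metrics like the small-aperture score were weak individually and more informative when averaged over a whole plan.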