807 results for Decisions and criterion
Abstract:
Background: In 1992, Frisch et al (Psychol Assess. 1992;4:92-101) developed the Quality of Life Inventory (QOLI) to measure the concept of quality of life (QOL), which has long been thought to be related to both physical and emotional well-being. However, the psychometric properties of the QOLI in clinical populations remain under debate. The present study examined the factor structure of the QOLI and reported its validity and reliability in a clinical sample. Method: Two hundred seventeen patients with anxiety and depressive disorders completed the QOLI; additional questionnaires measuring symptoms (Zung Self-rating Depression Scale, Beck Anxiety Inventory, Fear Questionnaire, Depression Anxiety Stress Scale-Stress) and subjective well-being (Satisfaction With Life Scale) were also administered. Results: Exploratory factor analysis via the principal components method, with oblique rotation, revealed a 2-factor structure that accounted for 42.73% of the total variance, and a subsequent confirmatory factor analysis suggested a moderate fit of the data to this model. The 2 factors appeared to describe self-oriented QOL and externally oriented QOL. The Cronbach alpha coefficients were 0.85 for the overall QOLI score, 0.81 for the first factor, and 0.75 for the second factor. Conclusion: Consistent evidence was also found to support the concurrent, discriminant, predictive, and criterion-related validity of the QOLI. © 2006 Elsevier Inc. All rights reserved.
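As a reading aid: the reliability figures quoted above are Cronbach alpha coefficients, which follow directly from the item-score matrix. A minimal sketch of that computation, run on hypothetical ratings rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point ratings from 6 respondents on 4 items
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```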
Abstract:
Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real-time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful when analysing data in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly where maximum likelihood methods are used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity, in terms of memory and speed, scales quadratically and cubically respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
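To make the likelihood-approximation idea concrete: in a Vecchia [1988]-style scheme, the joint Gaussian log-likelihood is replaced by a sum of conditional log-densities, each conditioning on only a few previously ordered neighbours, and because those terms are mutually independent they parallelise naturally across cores. The sketch below (1-D locations, a Gaussian covariance, a process pool) illustrates that structure under our own simplifying assumptions; it is not the paper's implementation.

```python
import numpy as np
from multiprocessing import Pool

def gauss_cov(d, sill=1.0, rng=0.3, nugget=1e-6):
    """Illustrative Gaussian covariance as a function of distance."""
    return sill * np.exp(-(d / rng) ** 2) + nugget * (d == 0)

def conditional_loglik(args):
    """One Vecchia term: log N(y_i | mean, var) given up to m neighbours."""
    yi, y_nb, xi, x_nb = args
    if len(y_nb) == 0:                      # first point: unconditional density
        var = gauss_cov(0.0)
        return -0.5 * (np.log(2 * np.pi * var) + yi ** 2 / var)
    C = gauss_cov(np.abs(x_nb[:, None] - x_nb[None, :]))  # neighbour covariance
    c = gauss_cov(np.abs(x_nb - xi))                      # cross-covariance
    w = np.linalg.solve(C, c)
    mu = w @ y_nb                                         # conditional mean
    var = gauss_cov(0.0) - w @ c                          # conditional variance
    return -0.5 * (np.log(2 * np.pi * var) + (yi - mu) ** 2 / var)

def vecchia_loglik(y, x, m=5, workers=2):
    """Approximate log-likelihood as a sum of independent conditional terms."""
    idx = np.argsort(x)                     # simple coordinate ordering
    y, x = y[idx], x[idx]
    tasks = [(y[i], y[max(0, i - m):i], x[i], x[max(0, i - m):i])
             for i in range(len(y))]
    with Pool(workers) as pool:             # embarrassingly parallel over terms
        return sum(pool.map(conditional_loglik, tasks))

if __name__ == "__main__":
    x = np.linspace(0, 1, 200)
    y = np.sin(6 * x) + 0.1 * np.random.default_rng(1).standard_normal(200)
    print(vecchia_loglik(y, x))
```

In practice the same decomposition is what makes parallel maximum likelihood estimation feasible: each core evaluates its share of conditional terms and only the scalar sums are combined.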
Abstract:
The Systems Engineering Group (SEG) at De Montfort University is developing the Boardman Soft Systems Methodology (BSSM), which allows complex human systems to be modelled; this work builds upon Checkland's Soft Systems Methodology (1981). The BSSM has been applied to the modelling of the systems engineering process as used in design and manufacturing companies. The BSSM is used to solicit information from a company, and this data is then transformed into systemic diagrams (systemigrams). These systemigrams are posited to be accurate and concise representations of the system which has been modelled. This paper describes the collaboration between SEG and a manufacturing company (MC) in Leicester, England. The purpose of this collaboration was twofold. First, it was to create an objective view of the MC's processes, in the form of systemigrams; it was important to have this modelled by a source outside the company, as it is difficult for people within a system being modelled to be unbiased. Second, it allowed a series of systemigrams to be produced which can then be subjected to simulation, for the purpose of aiding risk management decisions and reducing the project cycle time.
Abstract:
This thesis describes the design and engineering of a pressurised biomass gasification test facility. A detailed examination of the major elements within the plant has been undertaken in relation to specification of equipment, evaluation of options and final construction. The retrospective project assessment was developed from consideration of relevant literature and theoretical principles. The literature review includes a discussion on legislation and applicable design codes. From this analysis, each of the necessary equipment units was reviewed and important design decisions and procedures highlighted and explored. Particular emphasis was placed on examination of the stringent demands of the ASME VIII design codes. The inter-relationship of functional units was investigated and areas of deficiency, such as biomass feeders and gas cleaning, have been commented upon. Finally, plant costing was summarized in relation to the plant design and proposed experimental programme. The main conclusion drawn from the study is that pressurised gasification of biomass is far more difficult and expensive to support than atmospheric gasification. A number of recommendations have been made regarding future work in this area.
Abstract:
Although the strategic group and resource-based perspectives are frequently presented as mutually exclusive, we argue otherwise. The resource-based view informs strategic group analysis through a firm's product or service portfolio by offering a richer perspective on strategy and an additional lens for competitive group interpretation. Products act as the locus and bedrock for corporate decisions and form the backbone upon which market strategies are constructed. A "corporate genome" analogy is presented to illustrate how this process occurs within the U.K. pharmaceutical industry. © 2005 Elsevier Ltd. All rights reserved.
Abstract:
This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings, and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on the reduction of many work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale, rather than a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
Abstract:
In this chapter we track emerging issues in public participation and involvement in European policymaking. We focus on the politics, legitimacy and accountability of different actors as well as exploring how European participation processes relate to globalization in general and global and regional governance in particular. Health policies tend to be understood as national or even local, yet they are often shaped and defined by regulatory decisions and policies that are determined globally and regionally.
Abstract:
Purpose: (1) To devise a model-based method for estimating the probabilities of binocular fusion, interocular suppression and diplopia from psychophysical judgements. (2) To map out the way fusion, suppression and diplopia vary with binocular disparity and blur of single edges shown to each eye. (3) To compare the binocular interactions found for edges of the same vs opposite contrast polarity. Methods: Test images were single, horizontal, Gaussian-blurred edges, with blur B = 1-32 min arc and vertical disparity 0-8 × B, shown for 200 ms. In the main experiment, observers reported whether they saw one central edge, one offset edge, or two edges. We argue that the relation between these three response categories and the three perceptual states (fusion, suppression, diplopia) is indirect and likely to be distorted by positional noise and criterion effects, and so we developed a descriptive, probabilistic model to estimate both the perceptual states and the noise/criterion parameters from the data. Results: (1) Using simulated data, we validated the model-based method by showing that it recovered fairly accurately the disparity ranges for fusion and suppression. (2) The disparity range for fusion (Panum's limit) increased greatly with blur, in line with previous studies. The disparity range for suppression was similar to the fusion limit at large blurs, but two or three times the fusion limit at small blurs. This meant that diplopia was much more prevalent at larger blurs. (3) Diplopia was much more frequent when the two edges had opposite contrast polarity. A formal comparison of models indicated that fusion occurs for same, but not opposite, polarities. Probability of suppression was greater for unequal contrasts, and it was always the lower-contrast edge that was suppressed. Conclusions: Our model-based data analysis offers a useful tool for probing binocular fusion and suppression psychophysically. The disparity range for fusion increased with edge blur but fell short of complete scale-invariance. The disparity range for suppression also increased with blur but was not close to scale-invariance. Single vision occurs through fusion, but also beyond the fusion range, through suppression. Thus suppression can serve as a mechanism for extending single vision to larger disparities, but mainly for sharper edges, where the fusion range is small (5-10 min arc). For large blurs the fusion range is so much larger that no such extension may be needed. © 2014 The College of Optometrists.
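To illustrate (not reproduce) the indirect state-to-response mapping the authors model: in the toy generative sketch below, each trial falls into one of the three perceptual states, the perceived edge position is perturbed by Gaussian positional noise, and a report criterion decides between "one central edge" and "one offset edge". The specific noise/criterion scheme and all parameter values are our hypothetical choices, not the paper's parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_responses(p_fuse, p_supp, disparity, sigma=0.5, criterion=1.0, n=10000):
    """Toy mapping from perceptual states to the three response categories.
    fusion -> a single edge at the mean position (0); suppression -> a single
    edge at the surviving eye's position; diplopia -> two edges. Positional
    noise plus a report criterion distort which category gets reported."""
    states = rng.choice(3, size=n, p=[p_fuse, p_supp, 1 - p_fuse - p_supp])
    counts = {"one central": 0, "one offset": 0, "two edges": 0}
    for s in states:
        if s == 2:                          # diplopia: reported veridically here
            counts["two edges"] += 1
            continue
        true_pos = 0.0 if s == 0 else disparity / 2
        seen = true_pos + rng.normal(0.0, sigma)   # positional noise
        counts["one central" if abs(seen) < criterion else "one offset"] += 1
    return {k: v / n for k, v in counts.items()}

print(simulate_responses(p_fuse=0.6, p_supp=0.3, disparity=4.0))
```

Fitting a model of this kind in reverse, i.e. estimating the state probabilities and the noise/criterion parameters from observed response counts, is the inference problem the abstract describes.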
Abstract:
Intercultural communication in the global environment frequently involves recourse to translation. This generates new phenomena which, in turn, raise new questions for translation theory and practice. This issue is concerned with the concept of the hybrid text as one of these phenomena. In this introductory chapter, a hybrid text is defined as "a text that results from a translation process. It shows features that somehow seem 'out of place'/'strange'/'unusual' for the receiving culture, i.e. the target culture". It is important, however, to differentiate between the true hybrid, which is the result of positive authorial and/or translatorial decisions, and the inadequate text which exhibits features of translationese, resulting from a lack of competence. Textual, contextual and social features of hybrid texts are postulated (see discussion paper). These are the object of critical reflection in subsequent chapters, in relation to different genres. The potential of the hybrid text for translation research is explored.
Abstract:
We analyze a business model for e-supermarkets to enable multi-product sourcing capacity through co-opetition (collaborative competition). The logistics aspect of our approach is to design and execute a network system where “premium” goods are acquired from vendors at multiple locations in the supply network and delivered to customers. Our specific goals are to: (i) investigate the role of premium product offerings in creating critical mass and profit; (ii) develop a model for the multiple-pickup single-delivery vehicle routing problem in the presence of multiple vendors; and (iii) propose a hybrid solution approach. To solve the problem introduced in this paper, we develop a hybrid metaheuristic approach that uses a Genetic Algorithm for vendor selection and allocation, and a modified savings algorithm for the capacitated VRP with multiple pickup, single delivery and time windows (CVRPMPDTW). The proposed Genetic Algorithm guides the search for optimal vendor pickup location decisions, and for each generated solution in the genetic population, a corresponding CVRPMPDTW is solved using the savings algorithm. We validate our solution approach against published VRPTW solutions and also test our algorithm with Solomon instances modified for CVRPMPDTW.
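The control flow of such a hybrid can be summarised in a few lines: an outer Genetic Algorithm searches over vendor-assignment chromosomes, and each candidate is evaluated by solving the routing subproblem it induces. In the sketch below the modified savings algorithm is replaced by a placeholder cost (summed 1-D distances) so that only the hybrid's structure is shown; all names, locations, and parameter values are hypothetical.

```python
import random

random.seed(42)

def routing_cost(assignment, vendors, customers):
    """Placeholder for the modified savings algorithm: should return the cost
    of the CVRPMPDTW routes induced by this vendor assignment. Approximated
    here by summed vendor-to-customer distances (illustrative only)."""
    return sum(abs(vendors[v] - customers[c]) for c, v in assignment.items())

def crossover(a, b):
    """Uniform crossover over the customer -> vendor mapping."""
    return {c: (a[c] if random.random() < 0.5 else b[c]) for c in a}

def mutate(a, n_vendors, rate=0.1):
    """Reassign each customer to a random vendor with small probability."""
    return {c: (random.randrange(n_vendors) if random.random() < rate else v)
            for c, v in a.items()}

def hybrid_ga(vendors, customers, pop_size=30, generations=100):
    pop = [{c: random.randrange(len(vendors)) for c in customers}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: routing_cost(a, vendors, customers))
        elite = pop[: pop_size // 2]        # selection: keep the better half
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)),
                              len(vendors)) for _ in range(pop_size - len(elite))]
    return min(pop, key=lambda a: routing_cost(a, vendors, customers))

# Hypothetical 1-D positions for 3 vendors and 4 customers
vendors = [0.0, 5.0, 9.0]
customers = {"c1": 1.0, "c2": 4.0, "c3": 6.0, "c4": 8.0}
print(hybrid_ga(vendors, customers))
```

The design choice worth noting is the division of labour: the GA never routes vehicles itself; it only proposes assignments and lets the (here stubbed-out) routing heuristic price them.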
Abstract:
Background: Major Depressive Disorder (MDD) is among the most prevalent and disabling medical conditions worldwide. Identification of clinical and biological markers ("biomarkers") of treatment response could personalize clinical decisions and lead to better outcomes. This paper describes the aims, design, and methods of a discovery study of biomarkers in antidepressant treatment response, conducted by the Canadian Biomarker Integration Network in Depression (CAN-BIND). The CAN-BIND research program investigates and identifies biomarkers that help to predict outcomes in patients with MDD treated with antidepressant medication. The primary objective of this initial study (known as CAN-BIND-1) is to identify individual and integrated neuroimaging, electrophysiological, molecular, and clinical predictors of response to sequential antidepressant monotherapy and adjunctive therapy in MDD. Methods: CAN-BIND-1 is a multisite initiative involving 6 academic health centres working collaboratively with other universities and research centres. In the 16-week protocol, patients with MDD are treated with a first-line antidepressant (escitalopram 10-20 mg/d) that, if clinically warranted after eight weeks, is augmented with an evidence-based, add-on medication (aripiprazole 2-10 mg/d). Comprehensive datasets are obtained using clinical rating scales; behavioural, dimensional, and functioning/quality of life measures; neurocognitive testing; genomic, genetic, and proteomic profiling from blood samples; combined structural and functional magnetic resonance imaging; and electroencephalography. De-identified data from all sites are aggregated within a secure neuroinformatics platform for data integration, management, storage, and analyses. Statistical analyses will include multivariate and machine-learning techniques to identify predictors, moderators, and mediators of treatment response. Discussion: From June 2013 to February 2015, a cohort of 134 participants (85 outpatients with MDD and 49 healthy participants) has been evaluated at baseline. The clinical characteristics of this cohort are similar to other studies of MDD. Recruitment at all sites is ongoing to a target sample of 290 participants. CAN-BIND will identify biomarkers of treatment response in MDD through extensive clinical, molecular, and imaging assessments, in order to improve treatment practice and clinical outcomes. It will also create an innovative, robust platform and database for future research. Trial registration: ClinicalTrials.gov identifier NCT01655706. Registered July 27, 2012.
Abstract:
Background and objective: Safe prescribing requires accurate and practical information about drugs. Our objective was to measure the utility of current sources of prescribing guidance when used to inform practical prescribing decisions, and to compare current sources of prescribing guidance in the UK with idealized prescribing guidance. Methods: We developed 25 clinical scenarios. Two independent assessors rated and ranked the performance of five common sources of prescribing guidance in the UK when used to answer the clinical scenarios. A third adjudicator facilitated review of any disparities. An idealized list of contents for prescribing guidance was developed and sent for comments to academics and users of prescribing guidance. Following consultation, an operational check was used to assess compliance with the idealized criteria. The main outcome measures were relative utility in answering the clinical scenarios and compliance with the idealized prescribing guidance. Results: Current sources of prescribing guidance used in the UK differ in their utility when measured using clinical scenarios. The British National Formulary (BNF) and EMIS LV were the best performing sources in terms of both ranking (mean rank 1.24 and 2.20) and rating (100% and 72% of answers rated excellent or adequate). Current sources differed in the extent to which they fulfilled criteria for ideal prescribing guidance, but the BNF, and to a lesser extent EMIS LV, closely matched the criteria. Discussion: We have demonstrated how clinical scenarios can be used to assess prescribing guidance resources. Prescribers require high-quality information to support their practice, and producers of prescribing guidance documents should consider our idealized template. Conclusion: Our test was helpful in distinguishing between prescribing resources. Existing UK prescribing guidance resources differ in their ability to assist prescribers, and producers should consider the utility of their products to end-users, particularly in those more complex areas where prescribers may need most support. © 2010 Blackwell Publishing Ltd.
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly affecting the relief provided to victims during a disaster, and managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and showed that there is room for improvement in Mexican organisations' flood management.
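The two-stage logic described above (a cartographic filter that discards floodable sites, followed by a location-optimisation model) can be illustrated with a deliberately tiny brute-force sketch. Candidate sites, victim positions, and costs are all hypothetical, and a real instance would use a MILP solver over a road network rather than enumeration over 1-D points.

```python
from itertools import combinations

# Hypothetical candidate facilities: (name, 1-D position, floodable?)
candidates = [("F1", 0.0, False), ("F2", 3.0, True),
              ("F3", 6.0, False), ("F4", 9.0, False)]
victims = [1.0, 2.5, 5.0, 8.0, 8.5]   # hypothetical victim locations
OPEN_COST = 4.0                        # fixed cost of opening one facility

# Stage 1 (GIS cartographic model): drop floodable sites outright
safe = [(name, x) for name, x, floodable in candidates if not floodable]

# Stage 2 (location optimisation): open-facility cost plus distance from
# each victim location to its nearest open facility
def total_cost(subset):
    return OPEN_COST * len(subset) + sum(
        min(abs(v - x) for _, x in subset) for v in victims)

best = min((c for r in range(1, len(safe) + 1)
            for c in combinations(safe, r)), key=total_cost)
print([name for name, _ in best], total_cost(best))
```

Extending the objective with per-organisation capacities and a cost per participating actor is what would let the model also report the number of actors needed, which is the feature the thesis emphasises.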
Abstract:
The international trends of recent decades show that civil society organisations and nonprofit service providers have a significant impact on competitiveness. This paper gives an overview of the formal and informal mechanisms through which civil society keeps public administration accountable and influences public decisions and their implementation, thus contributing to the shift from "government" towards "governance". It also describes the ongoing transformation of public services, in which the signs of an emerging partnership between grassroots, community-controlled nonprofit service providers and government actors are becoming increasingly marked.
Abstract:
The study examines what effects various interest rates have on the price level and nominal prices in an open (primarily small) economy with free flows of capital. A closed economy calls for a distinction only between nominal and real rates of interest, but in an open economy questions of interest-rate parity have to be considered as well. It is necessary to clarify the components of the real interest rate, in which both exchange-rate pass-through and the scale of the risk premium play an important role. Analysis of interest-rate effects begins with the mechanism whereby the interest rate influences the cost of durable goods (their explicit or implicit rents). Secondly, the mechanism behind the relation of export-sector production decisions and domestic interest rates is examined; it emerges that the decisions of the export sector may be independent of domestic interest rates. Thirdly, certain types of pricing behaviour are studied. It is shown that a rise in the interest rate that does not alter the present exchange rate is a price-raising factor for the importing country. It can be concluded that even if the interest rate has a demand effect in a closed economy, this effect is presumably much weaker in a small open economy.
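The two relationships the analysis pivots on can be written compactly. The sketch below uses standard textbook notation (uncovered interest parity with a risk premium, and the Fisher decomposition of the real rate), not formulas taken from the study itself:

```latex
% Uncovered interest parity with a risk premium \rho_t:
% domestic nominal rate i_t, foreign rate i_t^{*}, log exchange rate e_t
i_t = i_t^{*} + \left( \mathbb{E}_t[e_{t+1}] - e_t \right) + \rho_t

% Fisher decomposition of the domestic real rate (expected inflation \pi_t^{e}):
r_t = i_t - \pi_t^{e}
```

Read together, these show why the open-economy case differs from the closed one: with free capital flows the domestic nominal rate is tied to the foreign rate, expected exchange-rate movements, and the risk premium, so an interest-rate change that leaves the current exchange rate unchanged must work through the expectation and premium terms rather than through domestic demand alone.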