13 results for spse model (situation, problem, solution, evaluation)

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

In order to reduce serious health incidents, individuals with high risks need to be identified as early as possible so that effective intervention and preventive care can be provided. This requires regular and efficient assessments of risk within communities that are the first point of contact for individuals. Clinical Decision Support Systems (CDSSs) have been developed to help with the task of risk assessment; however, such systems and their underpinning classification models are tailored towards those with clinical expertise. Communities where regular risk assessments are required lack such expertise. This paper presents the continuation of the GRiST research team's efforts to disseminate clinical expertise to communities. Based on our earlier published findings, this paper introduces the framework and skeleton for a data collection and risk classification model that evaluates data redundancy in real time, detects the risk-informative data and guides the risk assessors towards collecting those data. By doing so, it enables non-experts within the communities to conduct reliable mental health risk triage.
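As a loose illustration of guiding assessors towards the most risk-informative data, the sketch below ranks hypothetical assessment items by their estimated mutual information with a risk label using scikit-learn; the item names, data and selection rule are invented for illustration and are not part of the GRiST model.

```python
# Hedged sketch: rank candidate assessment items by how informative they are
# about a risk label, so an assessor could be guided to collect the most
# risk-informative data first. Item names and data are hypothetical.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)

# Synthetic answers to four candidate assessment items (rows = past cases).
X = rng.integers(0, 4, size=(200, 4))           # ordinal item responses 0-3
risk = (X[:, 0] + X[:, 2] > 3).astype(int)      # toy risk label

items = ["sleep_disturbance", "hopelessness", "social_support", "prior_episodes"]

# Estimate how much each item tells us about the risk label.
scores = mutual_info_classif(X, risk, discrete_features=True, random_state=0)

# Guide the assessor: collect the most informative items first.
for name, score in sorted(zip(items, scores), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```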

Relevance:

100.00%

Publisher:

Abstract:

In many problems in spatial statistics it is necessary to infer a global problem solution by combining local models. A principled approach to this problem is to develop a global probabilistic model for the relationships between local variables and to use this as the prior in a Bayesian inference procedure. We show how a Gaussian process with hyper-parameters estimated from Numerical Weather Prediction Models yields meteorologically convincing wind fields. We use neural networks to make local estimates of wind vector probabilities. The resulting inference problem cannot be solved analytically, but Markov Chain Monte Carlo methods allow us to retrieve accurate wind fields.
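To make the approach concrete, here is a minimal sketch of the general idea: a Gaussian-process prior over a spatial field combined with noisy local estimates, with the posterior explored by random-walk Metropolis sampling. The kernel, hyper-parameters, locations and observations are invented for illustration; they are not the NWP-trained model or the neural-network outputs used in the paper.

```python
# Hedged sketch: combine a Gaussian-process prior over a spatial field with
# noisy local estimates, and sample the posterior by random-walk Metropolis.
# Kernel choice, hyper-parameters and observations are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

# Spatial locations and a squared-exponential GP prior over the field.
x = np.linspace(0.0, 10.0, 25)
length_scale, prior_var, noise_var = 2.0, 1.0, 0.1
K = prior_var * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / length_scale**2)
K_inv = np.linalg.inv(K + 1e-8 * np.eye(len(x)))

# Noisy "local model" outputs at a few locations.
obs_idx = np.array([2, 8, 14, 20])
obs = np.array([0.5, 1.2, -0.3, 0.8])

def log_post(f):
    """Log posterior: GP prior on the field f plus Gaussian likelihood at obs_idx."""
    prior = -0.5 * f @ K_inv @ f
    lik = -0.5 * np.sum((obs - f[obs_idx]) ** 2) / noise_var
    return prior + lik

# Random-walk Metropolis over the whole field.
f = np.zeros(len(x))
lp = log_post(f)
samples = []
for it in range(20000):
    prop = f + 0.05 * rng.standard_normal(len(x))
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        f, lp = prop, lp_prop
    if it > 5000 and it % 10 == 0:
        samples.append(f.copy())

post_mean = np.mean(samples, axis=0)
print("posterior mean at observed points:", post_mean[obs_idx].round(2))
```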

Relevance:

100.00%

Publisher:

Abstract:

Purpose: A clinical evaluation of the Grand Seiko Auto Ref/Keratometer WAM-5500 (Japan) was performed to evaluate validity and repeatability compared with non-cycloplegic subjective refraction and Javal–Schiotz keratometry. An investigation into the dynamic recording capabilities of the instrument was also conducted. Methods: Refractive error measurements were obtained from 150 eyes of 75 subjects (aged 25.12 ± 9.03 years), subjectively by a masked optometrist, and objectively with the WAM-5500 at a second session. Keratometry measurements from the WAM-5500 were compared to Javal–Schiotz readings. Intratest variability was examined on all subjects, whilst intertest variability was assessed on a subgroup of 44 eyes 7–14 days after the initial objective measures. The accuracy of the dynamic recording mode of the instrument and its tolerance to longitudinal movement was evaluated using a model eye. An additional evaluation of the dynamic mode was performed using a human eye in relaxed and accommodated states. Results: Refractive error determined by the WAM-5500 was found to be very similar (p = 0.77) to subjective refraction (difference, -0.01 ± 0.38 D). The instrument was accurate and reliable over a wide range of refractive errors (-6.38 to +4.88 D). WAM-5500 keratometry values were steeper by approximately 0.05 mm in both the vertical and horizontal meridians. High intertest repeatability was demonstrated for all parameters measured: for sphere, cylinder power and MSE, over 90% of retest values fell within ±0.50 D of initial testing. In dynamic (high-speed) mode, the root-mean-square of the fluctuations was 0.005 ± 0.0005 D and a high level of recording accuracy was maintained when the measurement ring was significantly blurred by longitudinal movement of the instrument head. Conclusion: The WAM-5500 Auto Ref/Keratometer represents a reliable and valid objective refraction tool for general optometric practice, with important additional features allowing pupil size determination and easy conversion into high-speed mode, increasing its usefulness post-surgically following accommodating intra-ocular lens implantation, and as a research tool in the study of accommodation.
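The validity figures quoted above (a mean difference with its standard deviation, and the proportion of retests within ±0.50 D) correspond to a standard paired agreement analysis; the sketch below shows how such summaries are typically computed, on synthetic data rather than the study's measurements.

```python
# Hedged sketch: agreement between subjective refraction and an autorefractor,
# summarised as mean difference +/- SD, 95% limits of agreement and the
# proportion within +/-0.50 D. The paired data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
subjective = rng.uniform(-6.0, 5.0, size=150)               # mean spherical equivalent (D)
instrument = subjective + rng.normal(0.0, 0.38, size=150)   # simulated instrument readings

diff = instrument - subjective
mean_diff, sd_diff = diff.mean(), diff.std(ddof=1)
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

print(f"mean difference: {mean_diff:+.2f} D (SD {sd_diff:.2f} D)")
print(f"95% limits of agreement: {loa[0]:+.2f} to {loa[1]:+.2f} D")
print(f"within +/-0.50 D: {np.mean(np.abs(diff) <= 0.5) * 100:.0f}%")
```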

Relevance:

100.00%

Publisher:

Abstract:

The UK Government and large employers have recognised the skills gap between learners leaving the education system and the requirements of employers. The current system is seen to be failing significant numbers of learners and has been accused of schooling but not educating our young people. University-led technical colleges are one part of the solution being developed to provide outstanding engineering education. This paper focusses on the learning experience that the Aston University Engineering Academy, the first University-led University Technical College (UTC), has created for entrants to the Engineering Academy in September 2012, when it opens in brand new buildings next to the University. The overall aim is to produce technically literate young people who have business and enterprise skills as well as insight into the diverse range of opportunities in engineering and technical disciplines. The project has brought University staff and students together with employers and Academy staff to optimise the engineering education that the entrants will receive. The innovative model presented has drawn on research from across the world in the implementation of this new type of school, as well as educational practices from the USA and the Scandinavian countries. The resulting curriculum is authentic and exciting and expands the University model of problem-based learning and placements into the secondary school environment. The benefits of this close partnership for University staff and students, the employers and the Academy staff are expanded on, and the paper concludes with a prediction of progression routes from the Academy.

Relevance:

100.00%

Publisher:

Abstract:

The communicative practice in the ex-GDR was complex and diverse, although public political discourse had been fairly ritualized. Text-types characteristic of the Communist Party discourse were full of general (superordinate) terms whose semantic specification was hardly possible (propositional reduction). Changes in the social world result in changes in the communicative practice as well. However, a systematic comparison of text-types across cultures and across ideological boundaries reveals both differences in the textual macro- and superstructures and overlapping as well as universal features, probably related to functional aspects (discourse of power). Six sample texts of the text-type 'government declaration', two produced in the ex-GDR and four in the united Germany, are analysed. Special attention is paid to similarities and differences (i) in the textual superstructure (problem-solution schema), (ii) in the concepts that reflect the aims of political actions (simple worlds), and (iii) in the agents who (are to) perform these actions (concrete vs abstract agents). Similarities are found mainly in the discursive strategies, e.g. the legitimization of text actions. Differences become obvious in the strategies used for legitimization, and also in the conceptual domains referred to by the problem-solution schema. The metaphors of construction, path and challenge are of particular interest in this respect.

Relevance:

100.00%

Publisher:

Abstract:

Financial institutions are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of financial institutions (the banking sector) in GCC countries. Since the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform the evaluation. Furthermore, since the SORM results alone provide limited information for decision makers (bankers, investors, etc.), we propose a second-stage analysis using the classification and regression (C&R) method, combining the SORM results with other environmental data (financial, economic and political) to derive rules for efficient banks; the results are therefore useful to bankers seeking to improve their banks' performance and to investors seeking to maximize their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), and under each there are different methods with different assumptions. The parametric approach is based on econometric regression theory, while the nonparametric approach is based on mathematical linear programming theory. Under the nonparametric approach there are two methods: Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH); under the parametric approach there are three: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA) and Distribution-Free Analysis (DFA). The results show that DEA and SFA are the most applicable methods in the banking sector, with DEA seemingly the most popular among researchers. However, DEA, like SFA, still faces many challenges; one of these is how to deal with negative data, since DEA requires the assumption that all input and output values are non-negative, while in many applications negative outputs can appear, e.g. losses as opposed to profits. Although a few DEA models have been developed to deal with negative data, we believe that each of them has its own limitations, and we therefore developed the Semi-Oriented Radial Model (SORM) to handle the negativity issue in DEA. The application results using SORM show that the overall performance of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007) due to the second Gulf War and the international financial crisis, it remained higher than the efficiency scores of counterparts in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; these two countries were the most affected by the second Gulf War. The results also show no statistical relationship between operating style (Islamic or conventional) and bank efficiency. Even though there are no statistical differences due to operating style, Islamic banks appear to be more efficient than conventional banks, with an average efficiency score of 86.33% compared with 85.38%. Furthermore, Islamic banks appear to be more affected by the political crisis (the second Gulf War), whereas conventional banks appear to be more affected by the financial crisis.
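One common reading of the semi-oriented radial idea is that a variable containing negative values is split into two non-negative parts before a radial DEA model is solved. The sketch below illustrates that decomposition together with a generic input-oriented envelopment LP in scipy; it is a toy illustration of the general technique under that assumption, not the exact SORM formulation or data used in this study.

```python
# Hedged sketch: a semi-oriented treatment of an output that can be negative
# (e.g. profit/loss), followed by an input-oriented radial DEA LP per bank.
# The decomposition and LP illustrate the general idea only; data are invented.
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 banks, 1 input (operating cost), 1 output that may be negative (profit).
inputs = np.array([[5.0], [8.0], [6.0], [7.0], [4.0]])
profit = np.array([2.0, 3.5, -1.0, 1.5, 0.5])

# Split the signed output into two non-negative parts: the positive part stays
# an output, the negative part is handled like an input (something to shrink).
profit_pos = np.maximum(profit, 0.0)
profit_neg = np.maximum(-profit, 0.0)

X = np.column_stack([inputs, profit_neg.reshape(-1, 1)])  # quantities to contract
Y = profit_pos.reshape(-1, 1)                              # quantities to expand

n = X.shape[0]
for o in range(n):
    # Variables: [theta, lambda_1 .. lambda_n]; minimise theta.
    c = np.concatenate([[1.0], np.zeros(n)])
    A_ub, b_ub = [], []
    for i in range(X.shape[1]):           # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate([[-X[o, i]], X[:, i]]))
        b_ub.append(0.0)
    for r in range(Y.shape[1]):           # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate([[0.0], -Y[:, r]]))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    print(f"bank {o + 1}: efficiency = {res.x[0]:.3f}")
```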

Relevance:

100.00%

Publisher:

Abstract:

This research provides a novel approach for the determination of the water content and higher heating value of pyrolysis oil. Pyrolysis oil from Napier grass was used in this study. Water content was determined with pH adjustment using a Karl Fischer titration unit. An equation for the actual water in the oil was developed and used, and the results were compared with the traditional Karl Fischer method. The oil was found to have between 42 and 64% moisture under the same pyrolysis conditions, depending on the properties of the Napier grass prior to pyrolysis. The higher heating value of the pyrolysis oil was determined using an oil-diesel mixture, and 20 to 25 wt% of oil in the mixture gave optimum and stable results. A new model was developed for evaluating the higher heating value of dry pyrolysis oil. The dry oil has higher heating values in the range of 19 to 26 MJ/kg. The developed protocols and equations may serve as a reliable alternative means for establishing the actual water content and the higher heating value of pyrolysis oil.
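The oil-diesel blend measurement implies a simple back-calculation of the oil's higher heating value if a mass-weighted mixing rule is assumed, followed by a dry-basis correction. The sketch below shows that arithmetic on invented numbers (the diesel HHV, blend HHV and moisture content are placeholders); it is not the calibrated model developed in the paper.

```python
# Hedged sketch: back-calculate the higher heating value (HHV) of pyrolysis oil
# from a bomb-calorimeter measurement of an oil-diesel blend, assuming a simple
# mass-weighted mixing rule, then express it on a dry-oil basis.
# All numbers are illustrative, not the paper's calibrated equations.

def hhv_oil_from_blend(hhv_blend_mj_kg, hhv_diesel_mj_kg, oil_mass_fraction):
    """Mass-weighted mixing rule solved for the oil's HHV (wet basis)."""
    w = oil_mass_fraction
    return (hhv_blend_mj_kg - (1.0 - w) * hhv_diesel_mj_kg) / w

def hhv_dry_basis(hhv_wet_mj_kg, water_mass_fraction):
    """Assume the water contributes no heating value, so renormalise to dry oil."""
    return hhv_wet_mj_kg / (1.0 - water_mass_fraction)

# Example: a 20 wt% oil / 80 wt% diesel blend measured at 39.0 MJ/kg,
# diesel taken as 45.5 MJ/kg, oil water content 50 wt%.
hhv_wet = hhv_oil_from_blend(39.0, 45.5, 0.20)
print(f"HHV of wet oil: {hhv_wet:.1f} MJ/kg")
print(f"HHV of dry oil: {hhv_dry_basis(hhv_wet, 0.50):.1f} MJ/kg")
```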

Relevance:

50.00%

Publisher:

Abstract:

This paper re-assesses three independently developed approaches that are aimed at solving the problem of zero-weights or non-zero slacks in Data Envelopment Analysis (DEA). The methods are weights restricted, non-radial and extended facet DEA models. Weights restricted DEA models are dual to envelopment DEA models with restrictions on the dual variables (DEA weights) aimed at avoiding zero values for those weights; non-radial DEA models are envelopment models which avoid non-zero slacks in the input-output constraints. Finally, extended facet DEA models recognize that only projections on facets of full dimension correspond to well defined rates of substitution/transformation between all inputs/outputs which in turn correspond to non-zero weights in the multiplier version of the DEA model. We demonstrate how these methods are equivalent, not only in their aim but also in the solutions they yield. In addition, we show that the aforementioned methods modify the production frontier by extending existing facets or creating unobserved facets. Further we propose a new approach that uses weight restrictions to extend existing facets. This approach has some advantages in computational terms, because extended facet models normally make use of mixed integer programming models, which are computationally demanding.
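To make the role of weight restrictions concrete, the sketch below solves a textbook CCR multiplier DEA model with a simple lower bound on every weight, which is the most basic way of ruling out zero weights. The data and bound are arbitrary, and the formulation is generic rather than any of the specific weight-restricted, non-radial or extended-facet models compared in the paper.

```python
# Hedged sketch: a generic CCR multiplier DEA model with a lower bound on
# every weight, illustrating how weight restrictions rule out zero weights.
# Textbook formulation on toy data, not the paper's specific models.
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 DMUs, 2 inputs, 1 output.
X = np.array([[2.0, 3.0], [3.0, 1.0], [4.0, 4.0], [5.0, 2.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.5], [1.2]])                       # outputs
eps = 1e-3  # lower bound on all multiplier weights (the "weight restriction")

n, m, s = X.shape[0], X.shape[1], Y.shape[1]
for o in range(n):
    # Variables: [u_1..u_s, v_1..v_m]; maximise u'y_o, i.e. minimise -u'y_o.
    c = np.concatenate([-Y[o], np.zeros(m)])
    # Normalisation: v'x_o = 1.
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]
    # Feasibility: u'y_j - v'x_j <= 0 for every DMU j.
    A_ub = np.hstack([Y, -X])
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(eps, None)] * (s + m), method="highs")
    print(f"DMU {o + 1}: efficiency = {-res.fun:.3f}, weights = {res.x.round(3)}")
```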

Relevance:

50.00%

Publisher:

Abstract:

The topic of my research is consumer brand equity (CBE). My thesis is that the success or otherwise of a brand is better viewed from the consumers' perspective. I specifically focus on consumers as a unique group of stakeholders whose involvement with brands is crucial to the overall success of branding strategy. To this end, this research examines the constellation of ideas on brand equity that have hitherto been offered by various scholars. Through a systematic integration of the concepts and practices identified by these scholars (concepts and practices such as competitiveness, consumer searching, consumer behaviour, brand image, brand relevance, consumer perceived value, etc.), this research identifies CBE as a construct that is shaped, directed and made valuable by the beliefs, attitudes and subjective preferences of consumers. This is done by examining the criteria on the basis of which consumers evaluate brands and make brand purchase decisions. Understanding the criteria by which consumers evaluate brands is crucial for several reasons. First, as the basis upon which consumers select brands changes with consumption norms and technology, understanding the consumer choice process will help in formulating branding strategy. Secondly, an understanding of these criteria will help in formulating a creative and innovative agenda for 'new brand' propositions. Thirdly, it will also influence firms' ability to simulate and mould the plasticity of demand for existing brands. In examining these three issues, this thesis presents a comprehensive account of CBE: the first issue deals with the content of CBE; the second addresses the problem of developing a reliable and valid measuring instrument for CBE; and the third examines the structural and statistical relationships between the factors of CBE and the consequences of CBE for consumer perceived value (CPV). Using LISREL-SIMPLIS 8.30, the study finds direct and significant influential links between consumer brand equity and consumer value perception.

Relevance:

50.00%

Publisher:

Abstract:

Purpose – The purpose of this research is to develop a holistic approach to maximizing the customer service level while minimizing the logistics cost by using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach which considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates the developed fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain retains and increases its competitiveness from two aspects: optimizing the cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Considering the generalization of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further enhance the validity of the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers and for the construction and management of an optimal transshipment network. Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. Therefore, it is believed to be useful and applicable for transshipment service network design.
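For flavour, the sketch below derives crisp criteria weights from a triangular-fuzzy pairwise comparison matrix using fuzzy geometric means and centroid defuzzification, one common fuzzy AHP recipe. The criteria, judgements and the particular defuzzification step are illustrative assumptions, not the specific FAHP modification developed in this paper.

```python
# Hedged sketch: deriving criteria weights from a triangular-fuzzy pairwise
# comparison matrix via fuzzy geometric means and centroid defuzzification.
# Criteria and judgements are invented; this is one generic fuzzy AHP recipe.
import numpy as np

# Pairwise comparisons for 3 criteria (cost, delivery time, service quality)
# as triangular fuzzy numbers (l, m, u); reciprocals are (1/u, 1/m, 1/l).
tfn = np.array([
    [[1, 1, 1],       [2, 3, 4],       [1/2, 1, 3/2]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1/3, 1/2, 1]],
    [[2/3, 1, 2],     [1, 2, 3],       [1, 1, 1]],
], dtype=float)

# Fuzzy geometric mean of each row, component-wise over (l, m, u).
geo = np.prod(tfn, axis=1) ** (1.0 / tfn.shape[1])

# Fuzzy weights: divide by the reversed column sums, then defuzzify by centroid.
total = geo.sum(axis=0)               # (sum_l, sum_m, sum_u)
fuzzy_w = geo / total[::-1]           # l/sum_u, m/sum_m, u/sum_l
crisp_w = fuzzy_w.mean(axis=1)
crisp_w /= crisp_w.sum()

for name, w in zip(["cost", "delivery time", "service quality"], crisp_w):
    print(f"{name}: {w:.3f}")
```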

Relevance:

50.00%

Publisher:

Abstract:

We analyze a business model for e-supermarkets to enable multi-product sourcing capacity through co-opetition (collaborative competition). The logistics aspect of our approach is to design and execute a network system where “premium” goods are acquired from vendors at multiple locations in the supply network and delivered to customers. Our specific goals are to: (i) investigate the role of premium product offerings in creating critical mass and profit; (ii) develop a model for the multiple-pickup single-delivery vehicle routing problem in the presence of multiple vendors; and (iii) propose a hybrid solution approach. To solve the problem introduced in this paper, we develop a hybrid metaheuristic approach that uses a Genetic Algorithm for vendor selection and allocation, and a modified savings algorithm for the capacitated VRP with multiple pickup, single delivery and time windows (CVRPMPDTW). The proposed Genetic Algorithm guides the search for optimal vendor pickup location decisions, and for each generated solution in the genetic population, a corresponding CVRPMPDTW is solved using the savings algorithm. We validate our solution approach against published VRPTW solutions and also test our algorithm with Solomon instances modified for CVRPMPDTW.
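As background for the routing step, the sketch below shows the classical Clarke-Wright savings construction on a toy single-depot instance; the multiple-pickup, vendor-selection and time-window extensions described above are not included, and all coordinates, demands and the capacity are invented.

```python
# Hedged sketch: the classical Clarke-Wright savings step for a single-depot
# delivery problem -- the building block that a modified savings algorithm
# would extend with multiple pickups and time windows. Toy data only.
import math

depot = (0.0, 0.0)
customers = {1: (2.0, 3.0), 2: (5.0, 1.0), 3: (4.0, 4.0), 4: (1.0, 5.0)}
demand = {1: 3, 2: 4, 3: 2, 4: 3}
capacity = 7

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Savings s(i, j) = d(depot, i) + d(depot, j) - d(i, j), processed largest first.
savings = sorted(
    ((dist(depot, customers[i]) + dist(depot, customers[j])
      - dist(customers[i], customers[j]), i, j)
     for i in customers for j in customers if i < j),
    reverse=True,
)

# Start with one route per customer, then merge route ends while capacity allows.
routes = {i: [i] for i in customers}
for s, i, j in savings:
    ri, rj = routes[i], routes[j]
    if ri is rj:
        continue
    # Merge only if i ends its route and j starts its route (simplified: one orientation).
    if ri[-1] == i and rj[0] == j and sum(demand[k] for k in ri + rj) <= capacity:
        merged = ri + rj
        for k in merged:
            routes[k] = merged

unique_routes = {id(r): r for r in routes.values()}.values()
for r in unique_routes:
    print("route:", [0] + r + [0], "load:", sum(demand[k] for k in r))
```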

Relevance:

50.00%

Publisher:

Abstract:

Here we report on a potential catalytic process for efficient clean-up of plastic pollution in waters, such as the Great Pacific Garbage Patch (GPGP). Detailed catalytic mechanisms of RuO2 during supercritical water gasification of common polyolefin plastics, including low-density polyethylene (LDPE), high-density polyethylene (HDPE), polypropylene (PP) and polystyrene (PS), have been investigated in a batch reactor at 450 °C for 60 min. All four plastics gave very high carbon gasification efficiencies (CGE) and hydrogen gasification efficiencies (HGE). Methane was the highest gas component, with a yield of up to 37 mol kg⁻¹ LDPE using the 20 wt% RuO2 catalyst. Evaluation of the gas yields, CGE and HGE revealed that the conversion of PS involved thermal degradation, steam reforming and methanation, whereas hydrogenolysis was a possible additional mechanism during the conversion of the aliphatic plastics. The process has the benefits of producing a clean, pressurized methane-rich fuel gas as well as cleaning up hydrocarbon-polluted waters.
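Carbon and hydrogen gasification efficiencies are element balances between the plastic feed and the product gas. The sketch below illustrates that arithmetic for a hypothetical LDPE run; the gas yields are invented and are not the reported results.

```python
# Hedged sketch: carbon and hydrogen gasification efficiencies as element
# balances between the plastic feed and the product gas. Gas yields here are
# invented for a hypothetical LDPE run; they are not the reported results.

# Gas yields in mol per kg of LDPE fed (hypothetical).
gas_mol_per_kg = {"CH4": 30.0, "H2": 20.0, "CO2": 8.0, "CO": 2.0, "C2H6": 3.0}

# Moles of C and H atoms carried by one mole of each gas species.
atoms = {"CH4": (1, 4), "H2": (0, 2), "CO2": (1, 0), "CO": (1, 0), "C2H6": (2, 6)}

# LDPE is essentially (CH2)n: per kg, carbon = 12/14 kg and hydrogen = 2/14 kg.
feed_c_mol = (12.0 / 14.0) * 1000.0 / 12.0   # mol C per kg LDPE
feed_h_mol = (2.0 / 14.0) * 1000.0 / 1.0     # mol H per kg LDPE

gas_c = sum(n * atoms[g][0] for g, n in gas_mol_per_kg.items())
gas_h = sum(n * atoms[g][1] for g, n in gas_mol_per_kg.items())

# Note: HGE can exceed 100% because water itself donates hydrogen during
# supercritical water gasification (steam reforming / methanation).
print(f"CGE = {100.0 * gas_c / feed_c_mol:.1f}%")
print(f"HGE = {100.0 * gas_h / feed_h_mol:.1f}%")
```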