909 results for Wine and wine making Analysis
Abstract:
"No. 21, 1916."
Abstract:
Previously published: One hundred home-brewed wines. London : Country Life, 1927.
Abstract:
This report describes the practice of teamwork as expressed in case conferences for care of the elderly and evaluates the effectiveness of case conferences in their contribution to care. The study involved the observation of more than two hundred case conferences in sixteen locations throughout the West Midlands, in which one thousand seven hundred and three participants were involved. A related investigation of service outcomes involved an additional ninety-six patients who were interviewed in their homes. The purpose of the study was to determine whether the practice of teamwork and decision-making in case conferences is a productive and cost-effective method of working. Preliminary exploration revealed the extent to which the team approach is part of the organisational culture, which, it is asserted, serves to perpetuate the mythical value of team working. The study has demonstrated an active subscription to the case conference approach, yet has revealed many weaknesses, not least of which is clear evidence that certain team members are inhibited in their contribution. Further, the decisional process in case conferences has little consequence for care outcomes. Where outcomes are examined, there is evidence of service inadequacy. This work presents a challenge to professionals to confront their working practices with honesty and with vision, in the quest for the best and most cost-effective service to patients.
Abstract:
Guest editorial
Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He serves on the editorial boards of several international journals and is co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector
This special issue focuses on holistic, applied research on performance measurement in energy-sector management and on publishing relevant applied work that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures that estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while accounting for undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic releases) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best and that about 60 percent of the oil refineries in the sample could improve their efficiency further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model incorporates (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm’s private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated into the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic aspects, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm’s efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm’s efficiency following an acquisition and only weak evidence of efficiency improvements attributable to a new shareholder. The author also finds that parent companies appear not to influence a subsidiary’s efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
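Since several of the papers above rely on input-oriented DEA, a minimal sketch of the standard CCR envelopment model may help make the method concrete. The unit names, input/output data, and the use of scipy.optimize.linprog below are illustrative assumptions and are not taken from any of the studies summarised above.

```python
# A minimal input-oriented CCR DEA sketch (hypothetical data, not from the
# studies above): each decision-making unit (DMU) uses two inputs to produce
# one output; the LP below computes the radial efficiency score of each DMU.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs; columns = inputs (e.g. capital, fuel) and a single output
X = np.array([[20.0, 300.0],   # inputs of DMU 0
              [30.0, 200.0],
              [40.0, 100.0],
              [20.0, 200.0],
              [10.0, 400.0]])
Y = np.array([[100.0],         # output of DMU 0
              [ 80.0],
              [ 60.0],
              [ 70.0],
              [ 90.0]])

n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(o):
    """Efficiency of DMU o: min theta s.t. X'lam <= theta*x_o, Y'lam >= y_o, lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    # input constraints:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(m, 1), X.T]
    b_in = np.zeros(m)
    # output constraints: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1 marks a DMU on the efficient frontier; scores below 1 indicate the proportional input reduction that would bring the unit onto the frontier. Extensions used in the papers above (Malmquist indices, bootstrapping, undesirable outputs) build on this basic linear program.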
Abstract:
2000 Mathematics Subject Classification: 62-04, 62H30, 62J20
Abstract:
How far can community economics currently provide a basis for political decisions on the supranational centralization of taxation policies? The short answer will be that although the mainstream of community economics succeeds in analysing many relevant economic and political factors, it does not at present offer satisfactory decision criteria for policy-makers. This is because a central role is played in it by a factor that is exogenous to the models and alien to economic theory: a premise concerning the degree of goodwill to be expected from governments. The study examines the fiscal federalist theory of tax competition and tries to draw conclusions, on a more general level, about the present state of the economic theory of the public sector and its further development. The way out of the theoretical blind alley could be to link the theory of government operation and decision-making with the theory of desirable economic-policy decisions. The first attempts to do so have been made, but a systematic and comprehensive analysis is still awaited.
Abstract:
In the article “Menu Analysis: Review and Evaluation,” Lendal H. Kotschevar, Distinguished Professor, School of Hospitality Management, Florida International University, opens with this statement: “Various methods are used to evaluate menus. Some have quite different approaches and give different information. Even those using quite similar methods vary in the information they give. The author attempts to describe the most frequently used methods and to indicate their value. A correlation calculation is made to see how well certain of these methods agree in the information they give.” There is more than one way to look at the word menu. One is the set of culinary selections decided upon by the head chef or owner of a restaurant, which ultimately defines the type of restaurant. Another is the physical listing of the food, which a patron actually holds in his or her hand. These are the most common senses of the word. The author concentrates primarily on the latter and uses counts of the number of items sold on a menu to measure the popularity of any particular item. This, along with a formula, allows Kotschevar to arrive at a specific value per item. Menu analysis would appear a difficult subject to broach: how does one qualify and quantify a menu? It seems such a subjective exercise. The author offers methods and outlines for approaching menu analysis from an empirical perspective. “Menus are often examined visually through the evaluation of various factors. It is a subjective method but has the advantage of allowing scrutiny of a wide range of factors which other methods do not,” says Kotschevar. “The method is also highly flexible. Factors can be given a score value and scores summed to give a total for a menu. This allows comparison between menus. If the one making the evaluations knows menu values, it is a good method of judgment,” he further offers. The author wants the reader to know that assigning values is fundamental to a pragmatic menu analysis; it is how the reviewer keeps score, so to speak. These merit values provide reliable criteria against which to gauge a particular menu item. In the final analysis, menu evaluation provides the mechanism for either keeping or rejecting selected items on a menu. Kotschevar describes at least three matrix evaluation methods: the Miller method, the Smith and Kasavana method, and the Pavesic method. He offers illustrated examples of each in table format, which are helpful since trying to explain the theories behind the tables would be difficult at best. Kotschevar also references analysis methods which are not matrix based; the Hayes and Huffman Goal Value Analysis is one such method. The author sees no one method as better than another and suggests that combining two or more of the methods would be beneficial.
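As an illustration of the matrix-style evaluation the article reviews, the following is a minimal sketch in the spirit of the Smith and Kasavana (menu engineering) approach: items are scored by menu-mix share and contribution margin and placed into the usual four quadrants. The item names, prices, costs, and the 70 percent popularity rule are hypothetical assumptions, not figures from Kotschevar's article.

```python
# A hypothetical sketch of a Smith-and-Kasavana-style menu engineering matrix:
# items are classified by popularity (menu-mix share) and contribution margin.
# Item names, prices, and the 70% popularity rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MenuItem:
    name: str
    sold: int          # units sold in the period
    price: float       # selling price
    food_cost: float   # cost of ingredients

items = [
    MenuItem("House red (glass)", 420,  7.50, 2.10),
    MenuItem("Cheese board",      150, 12.00, 5.40),
    MenuItem("Mulled wine",        60,  6.00, 1.80),
    MenuItem("Tasting flight",    210, 15.00, 6.00),
]

total_sold = sum(i.sold for i in items)
total_cm   = sum((i.price - i.food_cost) * i.sold for i in items)

popularity_threshold = (1.0 / len(items)) * 0.70   # 70% of an equal share
avg_cm = total_cm / total_sold                     # sales-weighted average margin

for i in items:
    share = i.sold / total_sold
    cm = i.price - i.food_cost
    popular = share >= popularity_threshold
    profitable = cm >= avg_cm
    label = {(True, True): "star", (True, False): "plowhorse",
             (False, True): "puzzle", (False, False): "dog"}[(popular, profitable)]
    print(f"{i.name:20s} share={share:.2%} CM={cm:5.2f} -> {label}")
```

The counting-and-scoring step mirrors the article's point that popularity counts plus a value formula turn a subjective reading of a menu into comparable per-item scores.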
Abstract:
BACKGROUND: The American College of Cardiology guidelines recommend 3 months of anticoagulation after replacement of the aortic valve with a bioprosthesis. However, there remains great variability in current clinical practice and conflicting results from clinical studies. To assist clinical decision making, we pooled the existing evidence to assess whether anticoagulation in the setting of a new bioprosthesis was associated with improved outcomes or a greater risk of bleeding. METHODS AND RESULTS: We searched the PubMed database from its inception until April 2015 to identify original studies (observational studies or clinical trials) that assessed anticoagulation with warfarin in comparison with either aspirin or no antiplatelet or anticoagulant therapy. We included studies if their outcomes included thromboembolism or stroke/transient ischemic attacks and bleeding events. Quality assessment was performed in accordance with the Newcastle-Ottawa Scale, and random-effects analysis was used to pool the data from the available studies. I² testing was performed to assess the heterogeneity of the included studies. After screening 170 articles, a total of 13 studies (cases=6431; controls=18210) were included in the final analyses. The use of warfarin was associated with a significantly increased risk of overall bleeding (odds ratio, 1.96; 95% confidence interval, 1.25-3.08; P<0.0001) and of bleeding at 3 months (odds ratio, 1.92; 95% confidence interval, 1.10-3.34; P<0.0001) compared with aspirin or placebo. With regard to the composite primary outcome (risk of venous thromboembolism, stroke, or transient ischemic attack) at 3 months, no significant difference was seen with warfarin (odds ratio, 1.13; 95% confidence interval, 0.82-1.56; P=0.67). Moreover, anticoagulation was not shown to improve outcomes at time intervals >3 months (odds ratio, 1.12; 95% confidence interval, 0.80-1.58; P=0.79). CONCLUSIONS: Contrary to the current guidelines, this meta-analysis of previous studies suggests that anticoagulation in the setting of an aortic bioprosthesis significantly increases bleeding risk without a favorable effect on thromboembolic events. Larger, randomized controlled studies should be performed to further guide this clinical practice.
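To make the pooling step concrete, here is a minimal sketch of DerSimonian-Laird random-effects pooling of odds ratios with a Cochran's Q / I² heterogeneity estimate, the general approach described in the methods. The 2x2 study counts below are invented for illustration and are not the 13 studies included in the meta-analysis.

```python
# A hypothetical sketch of random-effects (DerSimonian-Laird) pooling of odds
# ratios with an I^2 heterogeneity estimate; the 2x2 counts are invented and
# are not the studies summarised in the abstract above.
import numpy as np

# each row: events_treated, n_treated, events_control, n_control
studies = np.array([
    [30, 400, 18, 410],
    [12, 150,  9, 160],
    [45, 900, 30, 880],
    [ 8, 120,  5, 115],
])

a, n1, c, n2 = studies.T
b, d = n1 - a, n2 - c

log_or = np.log((a * d) / (b * c))          # per-study log odds ratio
var    = 1/a + 1/b + 1/c + 1/d              # approximate variance of log OR
w_fixed = 1 / var

# Cochran's Q, I^2, and DerSimonian-Laird between-study variance tau^2
pooled_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_or - pooled_fixed) ** 2)
df = len(log_or) - 1
I2 = max(0.0, (Q - df) / Q) if Q > 0 else 0.0
tau2 = max(0.0, (Q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

# random-effects weights and pooled estimate with a 95% confidence interval
w_re = 1 / (var + tau2)
pooled = np.sum(w_re * log_or) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])

print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 = {I2:.0%}")
```

The key design choice of a random-effects model, as in the abstract, is that tau² widens the confidence interval when the studies disagree more than sampling error alone would explain, which I² quantifies as a percentage.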
Abstract:
This article evaluates the performance of public service broadcasters in the area of children’s television in Italy and Spain. It asks: how distinctive is the output of public service children’s channels? As a core area of public service provision, children’s television represents an important testing ground for wider debates about the distinctiveness of public service broadcasting in a digital age. Public broadcasters in Southern Europe have historically been more vulnerable to market pressure than their counterparts in continental and Northern Europe, and this is believed to have impacted negatively on their ability to maintain a distinctive public service profile. After engaging with debates on distinctiveness in order to develop a framework for the analysis, the article presents the results of a two-week analysis of the TV schedules of the main children’s channels operating in the two countries. It finds evidence that in both countries the output of public service children’s channels is distinctive to a degree, but also that there are important gaps in public service provision, as well as some significant differences between the public service children’s channels analysed.