974 results for Linear decision rules


Relevance:

30.00%

Publisher:

Abstract:

Objective: The Any Qualified Provider framework in the National Health Service has changed the way adult audiology services are offered in England. Under the new rules, patients are offered a choice of geographical location and audiology provider. This study aimed to explore how choices in treatment are presented and to identify what information patients need when seeking help with hearing loss. Design: The study adopted the qualitative methods of ethnographic observation and focus-group interviews to identify the information needed prior to, and during, help-seeking. Observational and focus-group data were analysed using the constant comparison method of grounded theory. Study sample: Participants were recruited from a community Health and Social Care Trust in the west of England, whose service incorporates both an Audiology and a Hearing Therapy service. Twenty-seven participants took part in focus groups or interviews. Results: Participants received little information beyond the details of hearing aids, and reported little information that was not directly related to hearing-aid uptake. Conclusions: Participant preferences were not explored, and the limited information resulted in decisions that were clinician-led. The gaps in information reflect previous data on clinician communication and highlight the need for consistent information on the range of interventions available to manage hearing loss.

Relevance:

30.00%

Publisher:

Abstract:

Loan financing in Swiss francs and Japanese yen has practically disappeared in Hungary in recent years: the conditions for granting foreign-currency-based loans have been tightened, the existing portfolio has deteriorated, and the losses borne by lenders have grown. The accounting rules, however, have hardly changed, which implies that the current regulation is still considered able to present the true and fair view that the Accounting Act sets as its fundamental objective. Under the accounting approach there is no material difference between FX and FX-denominated transactions: the rules on recognition, valuation and impairment are identical. After illustrating the magnitude of the problem, I describe the changes in the rules on valuing foreign-currency items, together with the reasons for those changes. I then show how the various exchange rates that may be applied affect the financial statements, and how the applicable rates relate to the FX positions reported in the balance sheet. On the impairment of foreign-currency receivables, I discuss the appropriate order of the year-end closing tasks and the differences between the impairment of FX receivables and of receivables denominated in FX but settled in forint cash flows.
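
As a purely illustrative example of how the applied exchange rate changes the reported figures (the amounts, rates and rate labels below are made up, not taken from the article), a year-end revaluation of a single FX receivable might look like this:

```python
# Illustrative year-end revaluation of an FX-denominated receivable under
# different closing rates; amounts and rates are hypothetical.
nominal_chf = 10_000                       # receivable denominated in CHF
booking_rate = 180.0                       # HUF/CHF rate at initial recognition
closing_rates = {"central bank mid rate": 240.0, "bank buy rate": 236.0, "bank sell rate": 244.0}

book_value = nominal_chf * booking_rate
for label, rate in closing_rates.items():
    revalued = nominal_chf * rate
    print(f"{label}: carrying amount {revalued:,.0f} HUF, "
          f"unrealized FX difference {revalued - book_value:+,.0f} HUF")
```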

Relevance:

30.00%

Publisher:

Abstract:

The paper investigates whether Hungarian monetary policy has taken country risk into account in its interest-rate decisions and, if so, how. To answer the question we use the most common tool for analysing monetary policy: we estimate Taylor rules describing the country's monetary policy. The estimation was carried out with several risk indicators and several variants of the Taylor rule; as a sensitivity analysis, measures of inflation and the output gap other than those in the baseline specification were also used. The results show that the interest-rate decisions of the National Bank of Hungary can be well described by a flexible inflation-targeting regime: the deviation of inflation from its target plays a significant role in the Taylor rules, and in some of the rules the output gap is also significant. The decision-makers also took country risk into account and responded to an increase in it by raising the interest rate. With an appropriate risk measure, adding country risk to the Taylor rule substantially improves the rule's fit.
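
For illustration only, a smoothed Taylor rule augmented with a country-risk term, of the general form such studies estimate, can be written as below; the paper's exact specification, coefficients and risk measure may differ.

```latex
i_t = \rho\, i_{t-1} + (1-\rho)\left[\bar{r} + \pi^{*} + \beta\,(\pi_t - \pi^{*}) + \gamma\, \tilde{y}_t + \delta\, \mathrm{risk}_t\right] + \varepsilon_t
```

Here i_t is the policy rate, \rho the interest-rate smoothing parameter, \pi_t - \pi^{*} the deviation of inflation from its target, \tilde{y}_t the output gap, and risk_t a country-risk indicator; a positive \delta corresponds to answering an increase in country risk with a higher rate, as the estimates described above suggest.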

Relevance:

30.00%

Publisher:

Abstract:

The civil jury has been under attack in recent years for being unreliable and incompetent. Considering the myriad causes of poor civil juror decision-making, the current investigation explores both procedural and evidentiary issues that affect jurors' decisions. Specifically, the first phase of this dissertation examines how jurors (mis)use evidence pertaining to the litigants when determining liability and awarding damages. After investigating how jurors utilize evidence, the focus shifts to exploring the utility of procedural reforms designed to improve decision-making (specifically, revising the instructions on the laws in the case and bifurcating the damage phases of the trial). Using the results from the first two phases of the research, the final study manipulates pieces of evidence related to the litigants while exploring the effects that revising the judicial instructions has on the utilization of evidence in particular and on decision-making in general. The dissertation was run online, allowing participants to access the study materials at their convenience. After giving consent, participants read the scenario of a fictitious product-liability case with the litigant manipulations incorporated into the summary. Participants answered several attitudinal, case-specific, and comprehension questions, and were instructed to find in favor of one side and to award any damages they felt were warranted. Exploratory factor analyses, probit and linear regressions, and path analyses were used to analyze the data (Mplus and SPSS were the software packages used). Results indicated that misuse of evidence was fairly frequent, though the mock jurors also utilized evidence appropriately. Although the results did not support bifurcation as a viable procedural reform, revising the judicial instructions did significantly increase comprehension rates. Trends in the data suggested that better decision-making occurred when the revised instructions were used, providing empirical support for this procedural reform as a means of improving civil jury decision-making. Implications for actual trials and attorneys are discussed.
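
As a minimal sketch of the kind of probit analysis mentioned above (simulated data and hypothetical variable names, not the dissertation's materials):

```python
# Sketch: probit regression of mock jurors' liability verdicts on hypothetical
# design variables (revised instructions, bifurcation, comprehension score).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "revised_instructions": rng.integers(0, 2, n),   # 1 = plain-language instructions
    "bifurcated_trial": rng.integers(0, 2, n),       # 1 = damages decided separately
    "comprehension_score": rng.normal(0.0, 1.0, n),  # standardized comprehension
})
# Simulated binary verdicts driven by a latent propensity (purely for illustration)
latent = 0.3 * df["revised_instructions"] + 0.5 * df["comprehension_score"] + rng.normal(0, 1, n)
df["liable"] = (latent > 0).astype(int)

X = sm.add_constant(df[["revised_instructions", "bifurcated_trial", "comprehension_score"]])
model = sm.Probit(df["liable"], X).fit(disp=False)
print(model.summary())
```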

Relevance:

30.00%

Publisher:

Abstract:

This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model in an MTO setting, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and mostly lead to heuristic solutions. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model the problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to the resulting mixed-integer linear programming model on small instances to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of the study focuses on developing an effective solution approach to the large-scale problem. The proposed solution approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems, and a computer-simulation experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted-most-profit rule performs best. The shifting-bottleneck and earliest-operation-finish-time rules are the best scheduling rules. For lot sizing, the proposed minimum-cost-increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high, while the proposed minimum-cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio. The proposed heuristic is applied to an industrial case to further evaluate its performance; the result shows that it improves total profit by an average of 16.62%. This research contributes to the production-planning research community a complete mathematical definition of the problem and an effective solution approach for solving it at industrial scale.
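
A minimal sketch, on hypothetical data, of a joint order-selection and lot-assignment mixed-integer model of the kind described; the dissertation's actual formulation, with full job-shop scheduling constraints, is considerably larger.

```python
# Minimal sketch of joint order selection and lot assignment on one bottleneck
# machine, expressed with PuLP (solvable with its default CBC solver or CPLEX).
import pulp

orders = {           # order: (profit, processing_hours, due_period)
    "A": (900, 12, 1),
    "B": (650, 10, 2),
    "C": (1200, 20, 2),
}
periods = [1, 2]
capacity = {1: 24, 2: 24}       # bottleneck hours available per period
setup_cost = 100                # incurred for each lot produced in a period

prob = pulp.LpProblem("order_selection_lot_sizing", pulp.LpMaximize)
accept = pulp.LpVariable.dicts("accept", orders, cat="Binary")
produce = pulp.LpVariable.dicts(
    "produce", [(o, t) for o in orders for t in periods], cat="Binary")

# Profit from accepted orders minus setup costs of the lots that are run.
prob += (pulp.lpSum(orders[o][0] * accept[o] for o in orders)
         - setup_cost * pulp.lpSum(produce[o, t] for o in orders for t in periods))

for o, (profit, hours, due) in orders.items():
    # An accepted order is produced in exactly one period, no later than its due period.
    prob += pulp.lpSum(produce[o, t] for t in periods if t <= due) == accept[o]
    prob += pulp.lpSum(produce[o, t] for t in periods if t > due) == 0

for t in periods:
    # Bottleneck capacity in each period.
    prob += pulp.lpSum(orders[o][1] * produce[o, t] for o in orders) <= capacity[t]

prob.solve()
print({o: int(accept[o].value()) for o in orders})
```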

Relevance:

30.00%

Publisher:

Abstract:

Road pricing has emerged as an effective means of managing road traffic demand while simultaneously raising additional revenue for transportation agencies. Research on the factors that govern travel decisions has shown that user preferences may be a function of the demographic characteristics of the individuals and the perceived trip attributes. However, it is not clear which trip attributes are actually considered in the travel decision-making process, how these attributes are perceived by travelers, and how the set of trip attributes changes as a function of the time of day or from day to day. In this study, operational Intelligent Transportation Systems (ITS) archives are mined and the aggregated preferences for a priced system are extracted at a fine time-aggregation level for an extended number of days. The resulting information is related to corresponding time-varying trip attributes such as travel time, travel-time reliability, charged toll, and other parameters. The time-varying user preferences and trip attributes are linked together by means of a binary choice model (logit) with a linear utility function on trip attributes. The trip-attribute weights in the utility function are then dynamically estimated for each time of day by means of an adaptive, limited-memory discrete Kalman filter (ALMF). The relationship between traveler choices and travel time is assessed using different rules to capture the logic that best represents traveler perception and the effect of real-time information on the observed preferences. The impact of travel-time reliability on traveler choices is investigated considering its multiple definitions. It can be concluded from the results that the ALMF algorithm allows a robust estimation of time-varying weights in the utility function at fine time-aggregation levels. The high correlations among the trip attributes severely constrain the simultaneous estimation of their weights in the utility function. Despite the data limitations, it is found that the ALMF algorithm can provide stable estimates of the choice parameters for some periods of the day. Finally, it is found that the daily variation of the user sensitivities for different periods of the day resembles a well-defined normal distribution.
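
A minimal sketch of tracking time-varying utility weights with a discrete Kalman filter; here the observed aggregate log-odds of choosing the priced facility is treated as a linear function of the trip attributes, which simplifies the paper's logit/ALMF setup, and all data are simulated.

```python
# Sketch: track time-varying weights w_t in log-odds(share_t) ~ x_t . w_t with a
# random-walk state and a discrete Kalman filter (simplification of the ALMF).
import numpy as np

rng = np.random.default_rng(1)
T, k = 200, 3                      # time steps; attributes: toll, time saving, reliability
true_w = np.array([-0.8, 0.15, 0.1])
X = np.column_stack([rng.uniform(1, 5, T),      # toll ($)
                     rng.uniform(2, 15, T),     # travel-time saving (min)
                     rng.uniform(0, 10, T)])    # reliability gain
logodds = X @ true_w + rng.normal(0, 0.3, T)    # observed aggregate log-odds

Q = 1e-4 * np.eye(k)   # random-walk covariance of the weights
R = 0.3 ** 2           # observation noise variance
w = np.zeros(k)        # weight estimate
P = np.eye(k)          # estimate covariance

estimates = []
for t in range(T):
    P = P + Q                       # predict: state covariance inflates by Q
    H = X[t]                        # design row for the scalar observation
    S = H @ P @ H + R
    K = P @ H / S                   # Kalman gain
    w = w + K * (logodds[t] - H @ w)
    P = P - np.outer(K, H) @ P
    estimates.append(w.copy())

print(np.round(estimates[-1], 3))   # typically close to true_w
```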

Relevance:

30.00%

Publisher:

Abstract:

Advances in three related areas are addressed: state-space modeling, sequential Bayesian learning, and decision analysis, together with the statistical challenges of scalability and associated dynamic sparsity. The key theme that ties the three areas together is Bayesian model emulation: solving challenging analysis and computational problems using creative model emulators. This idea underpins theoretical and applied advances in non-linear, non-Gaussian state-space modeling, dynamic sparsity, decision analysis and statistical computation, across the linked contexts of multivariate time series and dynamic network studies. Examples and applications in financial time series and portfolio analysis, macroeconomics, and internet studies from computational advertising demonstrate the utility of the core methodological innovations.

Chapter 1 summarizes the three areas/problems and the key idea of emulation in those areas. Chapter 2 discusses the sequential analysis of latent threshold models with emulating models that allow analytical filtering to enhance the efficiency of posterior sampling. Chapter 3 examines the emulator model in decision analysis, or the synthetic model, that is equivalent to the loss function in the original minimization problem, and shows its performance in the context of sequential portfolio optimization. Chapter 4 describes a method for modeling streaming count data observed on a large network that relies on emulating the whole, dependent network model with independent, conjugate sub-models customized to each set of flows. Chapter 5 reviews these advances and offers concluding remarks.
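
As a toy illustration of the conjugate sub-model idea mentioned for Chapter 4 (not the dissertation's actual dynamic network model), a Gamma-Poisson update for the count rate on a single flow looks like this:

```python
# Toy conjugate Gamma-Poisson update for the rate of one network flow.
# Observing counts y_1..y_n with rate lambda ~ Gamma(a, b) gives the posterior
# Gamma(a + sum(y), b + n); each flow can be updated independently of the others.
a, b = 2.0, 1.0                 # prior shape and rate for the flow's count rate
counts = [4, 7, 5, 6]           # hypothetical counts observed on this flow

a_post = a + sum(counts)
b_post = b + len(counts)
print(f"posterior mean rate: {a_post / b_post:.2f}")   # (2+22)/(1+4) = 4.80
```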

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a numerical study of a linear compressor cascade to investigate effective end-wall profiling rules for highly loaded axial compressors. The first step of the research applies a correlation analysis to the different flow-field parameters, using data mining over 600 profiling samples, to quantify how variations of loss, secondary flow and passage vortex interact with each other under the influence of a profiled end wall. The result identifies the dominant role of corner separation in the control of total pressure loss, yielding the principle that only in a flow field with serious corner separation does the profiled end wall change total pressure loss, secondary flow and passage vortex in the same direction. In the second step, a multi-objective optimization of a profiled end wall is performed to reduce loss at the design point and near the stall point. The development of effective end-wall profiling rules is based on the manner of secondary flow control rather than on the geometric features of the end wall. Using the optimum end-wall cases from the Pareto front, a quantitative tool for analyzing secondary flow control is employed. The driving forces induced by a profiled end wall on different regions of the end-wall flow are subjected to a detailed analysis and identified for their positive or negative influence in relieving corner separation, from which the effective profiling rules are further confirmed. It is found that the profiling rules for a cascade differ distinctly between the design point and the near-stall point, so loss control at different operating points is largely independent.
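
A sketch of the first analysis step on stand-in data; the column names and relationships are hypothetical, not the paper's CFD database.

```python
# Pairwise correlations between flow-field metrics over a set of end-wall
# profiling samples, mimicking the correlation analysis described above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 600
corner_sep = rng.normal(0, 1, n)                      # corner-separation extent (normalized)
samples = pd.DataFrame({
    "total_pressure_loss": 1.0 * corner_sep + rng.normal(0, 0.3, n),
    "secondary_flow":      0.8 * corner_sep + rng.normal(0, 0.4, n),
    "passage_vortex":      0.7 * corner_sep + rng.normal(0, 0.5, n),
})
print(samples.corr().round(2))   # strong pairwise correlations driven by corner separation
```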

Relevance:

30.00%

Publisher:

Abstract:

This article introduces the first findings of the Political Party Database Project, a major survey of party organizations in parliamentary and semi-presidential democracies. The project’s first round of data covers 122 parties in 19 countries. In this article, we describe the scope of the database, then investigate what it tells us about contemporary party organization in these countries, focusing on parties’ resources, structures and internal decision-making. We examine organizational patterns by country and party family, and where possible we make temporal comparisons with older data sets. Our analyses suggest a remarkable coexistence of uniformity and diversity. In terms of the major organizational resources on which parties can draw, such as members, staff and finance, the new evidence largely confirms the continuation of trends identified in previous research: that is, declining membership, but enhanced financial resources and more paid staff. We also find remarkable uniformity regarding the core architecture of party organizations. At the same time, however, we find substantial variation between countries and party families in terms of their internal processes, with particular regard to how internally democratic they are, and the forms that this democratization takes.

Relevance:

30.00%

Publisher:

Abstract:

With the evolution of today's knowledge-based economies, the labour market has become more competitive. As a way of acquiring skills that benefit their careers, university students take advantage of the many opportunities available and go abroad to study. This study develops and empirically tests a structural model that examines the antecedents influencing the decision-making process of Erasmus students under mobility for studies (EMS) in Aveiro, Coimbra and Porto (2014-2015). Reliability analysis, exploratory factor analysis and linear regressions were used to evaluate the model. Based on a survey with a sample of 872 valid responses, the study demonstrates that EMS students are also influenced by touristic factors, which supports what has recently been reported by other authors. The conclusions and suggestions can be applied by other organizations, mainly Higher Education Institutions, in order to attract more EMS students.
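
A sketch of the analysis pipeline described above, on simulated survey data with hypothetical item names.

```python
# Exploratory factor analysis on survey items followed by a linear regression of
# the decision outcome on the extracted factors (simulated, illustrative data).
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 872
academic = rng.normal(0, 1, n)          # latent academic motivation
touristic = rng.normal(0, 1, n)         # latent touristic motivation
items = np.column_stack([
    academic + rng.normal(0, .5, n),    # e.g. "quality of host university"
    academic + rng.normal(0, .5, n),    # e.g. "course recognition"
    touristic + rng.normal(0, .5, n),   # e.g. "attractiveness of the city"
    touristic + rng.normal(0, .5, n),   # e.g. "climate and leisure"
])

factors = FactorAnalysis(n_components=2, random_state=0).fit_transform(items)
decision = 0.6 * academic + 0.4 * touristic + rng.normal(0, 1, n)  # e.g. intention score

ols = sm.OLS(decision, sm.add_constant(factors)).fit()
print(ols.params.round(2))
```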

Relevance:

30.00%

Publisher:

Abstract:

Current practices in agricultural management involve the application of rules and techniques to ensure high-quality and environmentally friendly production. Based on their experience, agricultural technicians and farmers make critical decisions affecting crop growth while considering several interwoven agricultural, technological, environmental, legal and economic factors. In this context, decision support systems, and the knowledge models that underpin them, enable valuable experience to be incorporated into software systems that support agricultural technicians in making rapid and effective decisions for efficient crop growth. Pest control is an important issue in agricultural management because of the crop-yield reductions caused by pests, and it requires expert knowledge. This paper presents a formalisation of the pest control problem and of the workflow followed by agricultural technicians and farmers in integrated pest management, the crop production strategy that combines different practices for growing healthy crops whilst minimising pesticide use. A generic decision schema for estimating the infestation risk of a given pest on a given crop is defined; it acts as a metamodel for maintaining and extending the knowledge embedded in the pest management decision support system that is also presented. The software tool has been implemented by integrating a rule-based tool into a web-based architecture. Evaluation from validity and usability perspectives concluded that both agricultural technicians and farmers considered it a useful tool for pest control, particularly for training new technicians and inexperienced farmers.
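
A toy sketch of such a rule-based infestation-risk estimate; the factors, thresholds and risk levels are hypothetical, not the paper's validated knowledge model.

```python
# Toy rule-based estimator of infestation risk for one pest on one crop, in the
# spirit of the generic decision schema described above.
def infestation_risk(mean_temp_c: float, humidity_pct: float,
                     trap_count: float, crop_stage: str) -> str:
    """Combine weather, monitoring and crop-stage rules into a risk level."""
    score = 0
    if 22 <= mean_temp_c <= 30:          # favourable temperature window for the pest
        score += 2
    if humidity_pct >= 70:               # humid conditions promote development
        score += 1
    if trap_count >= 20:                 # monitoring traps above action threshold
        score += 2
    if crop_stage in ("flowering", "fruit_set"):   # most susceptible stages
        score += 1
    return "high" if score >= 4 else "medium" if score >= 2 else "low"

print(infestation_risk(26, 80, 25, "flowering"))   # -> "high"
```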

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Regional differences in physician supply can be found in many health care systems, regardless of their organizational and financial structure. A theoretical model is developed for physicians' decisions on office location, covering demand-side factors and a consumption-time function. METHODS: To test the propositions following from the theoretical model, generalized linear models were estimated to explain differences across 412 German districts. Various factors found in the literature were included to control for physicians' regional preferences. RESULTS: Evidence in favor of the first three propositions of the theoretical model was found. Specialists show a stronger association with more highly populated districts than GPs. Although indicators of regional preferences are significantly correlated with physician density, their coefficients are not as large as that of population density. CONCLUSIONS: If regional disparities are to be addressed by political action, the focus should be on counteracting the parameters that represent physicians' preferences in over- and undersupplied regions.
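
A sketch of a generalized linear model of district-level physician counts, on simulated data with hypothetical covariate names; the paper's covariates and model family may differ.

```python
# Poisson GLM for district-level physician counts with demand-side and
# preference-proxy covariates (simulated, illustrative data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 412
districts = pd.DataFrame({
    "log_population": rng.normal(11.5, 0.6, n),     # log district population
    "share_over_65": rng.uniform(0.15, 0.30, n),    # demand-side factor
    "amenity_index": rng.normal(0, 1, n),           # proxy for regional preferences
})
mu = np.exp(-8 + 0.9 * districts["log_population"]
            + 2.0 * districts["share_over_65"] + 0.1 * districts["amenity_index"])
districts["physicians"] = rng.poisson(mu)

glm = smf.glm("physicians ~ log_population + share_over_65 + amenity_index",
              data=districts, family=sm.families.Poisson()).fit()
print(glm.summary())
```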

Relevance:

30.00%

Publisher:

Abstract:

An early decision market is governed by rules that allow each student to apply to (at most) one college and require the student to attend this college if admitted. This market is ubiquitous in college admissions in the United States. We model this market as an extensive-form game of perfect information and study a refinement of subgame perfect equilibrium (SPE) that induces undominated Nash equilibria in every subgame (SPUE). Our main result shows that this game can be used to define a decentralized matching mechanism that weakly Pareto dominates student-proposing deferred acceptance.
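
For reference, a minimal implementation of student-proposing deferred acceptance, the benchmark mechanism the abstract compares against; the preferences and capacities below are hypothetical.

```python
# Student-proposing deferred acceptance (Gale-Shapley) with college capacities.
def deferred_acceptance(student_prefs, college_prefs, capacity):
    """Return a stable matching {student: college}."""
    rank = {c: {s: i for i, s in enumerate(prefs)} for c, prefs in college_prefs.items()}
    next_choice = {s: 0 for s in student_prefs}      # index of next college to propose to
    held = {c: [] for c in college_prefs}            # tentatively admitted students
    free = list(student_prefs)
    while free:
        s = free.pop()
        if next_choice[s] >= len(student_prefs[s]):
            continue                                 # list exhausted; s stays unmatched
        c = student_prefs[s][next_choice[s]]
        next_choice[s] += 1
        held[c].append(s)
        held[c].sort(key=lambda x: rank[c][x])       # college keeps its best applicants
        if len(held[c]) > capacity[c]:
            free.append(held[c].pop())               # worst-ranked applicant is rejected
    return {s: c for c, students in held.items() for s in students}

students = {"s1": ["c1", "c2"], "s2": ["c1", "c2"], "s3": ["c2", "c1"]}
colleges = {"c1": ["s1", "s3", "s2"], "c2": ["s2", "s1", "s3"]}
print(deferred_acceptance(students, colleges, {"c1": 1, "c2": 2}))
```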
