987 results for theoretical methods


Relevance:

30.00%

Publisher:

Abstract:

Risk capital allocation in finance is important both theoretically and in practical applications. How can the risk of a portfolio be shared among its sub-portfolios? How should capital reserves be set to cover risks, and how should the reserves be assigned to the business units? The study uses an axiomatic approach to analyse risk capital allocation, that is, it requires the allocation methods to satisfy basic properties. The starting point is Csóka and Pintér (2010), who showed that the axioms of coherent measures of risk are not compatible with some fairness, incentive compatibility and stability requirements of risk allocation. This paper discusses these requirements using analytical and simulation tools. It analyses allocation methods used in practical applications as well as methods with theoretically interesting properties. The main conclusion is that the problem identified by Csóka and Pintér (2010) remains relevant in practical applications: it is not just a theoretical issue but a frequently occurring practical problem. A further contribution is the characterization of the examined allocation methods, which helps practitioners choose among the different methods available.

Relevance:

30.00%

Publisher:

Abstract:

The paper provides an axiomatic analysis of some scoring procedures based on paired comparisons. Several methods have been proposed for these generalized tournaments, and some of them have also been characterized by a set of properties. Nevertheless, the choice of an appropriate method is not straightforward; it is best supported by a discussion of the methods' theoretical properties. In the paper we focus on monotonicity over the compared objects' performances, starting from the axioms of self-consistency and self-consistent monotonicity. We present possible weakenings and extensions of these properties, as well as an impossibility theorem: self-consistency contradicts independence of irrelevant matches. Satisfiability of the properties is examined through three scoring procedures, the classical score, the generalised row sum and the least squares methods, each of which can be calculated as the solution of a system of linear equations. Our results contribute to the problem of finding a proper paired-comparison-based scoring method.
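As a concrete illustration of the last point, the following sketch (data and function names invented, not taken from the paper) computes least squares ratings by solving the Laplacian system L q = s of the comparison multigraph, with the usual normalization that the ratings sum to zero:

```python
# Illustrative sketch of the least squares scoring method: rate objects from
# paired comparisons by solving L q = s, where L is the Laplacian of the
# comparison multigraph and s is the vector of aggregate score differences.

def gauss_solve(A, b):
    """Plain Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def least_squares_ratings(matches, results):
    """matches[i][j]: number of comparisons between i and j;
    results[i][j]: wins of i over j minus wins of j over i."""
    n = len(matches)
    # Laplacian: diagonal = matches played, off-diagonal = -matches[i][j]
    A = [[(sum(matches[i]) if i == j else -matches[i][j]) for j in range(n)]
         for i in range(n)]
    b = [float(sum(results[i])) for i in range(n)]
    # L is singular (rows sum to zero); impose the normalization sum(q) = 0
    A[-1] = [1.0] * n
    b[-1] = 0.0
    return gauss_solve(A, b)

# Three objects: A beat B twice, split two matches with C; B beat C once.
matches = [[0, 2, 2], [2, 0, 1], [2, 1, 0]]
results = [[0, 2, 0], [-2, 0, 1], [0, -1, 0]]
q = least_squares_ratings(matches, results)
print(q)  # A ranked first; B and C tie
```

Note that, unlike the raw score, the least squares rating adjusts for the strength of the opposition faced, which is exactly why B and C tie here despite different match records.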

Relevance:

30.00%

Publisher:

Abstract:

Ranking alternatives compared in pairs is a problem that arises in social choice theory, statistics, scientometrics, psychology and sport. Based on the international literature, we review the solution concepts in detail and show how the questions arising in practical applications can be handled and how a mathematical framework matching real data can be built. We focus on the definition of the paired comparison matrix, on the main scoring procedures and on their relations. The paper gives a theoretical analysis of the invariant, fair bets and PageRank methods, which are founded on the Perron-Frobenius theorem, as well as of the internal slackening and positional power procedures proposed for ranking the nodes of a directed graph. An axiomatic approach is recommended for choosing among them: we present characterizations of the invariant and fair bets methods and discuss some debatable properties of the methods, i.e. their main weaknesses.
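Of the methods named above, PageRank is the easiest to make concrete. The sketch below (graph and parameters invented for illustration) computes the ranking by power iteration on the damped, column-stochastic link matrix, whose unique positive dominant eigenvector is guaranteed by the Perron-Frobenius theorem; it assumes every node has at least one out-link:

```python
# Illustrative PageRank by power iteration (not the paper's own code).
# Each node's rank is redistributed along its out-links, with a damping
# term that keeps the iteration matrix positive and hence Perron-Frobenius
# applicable.

def pagerank(out_links, damping=0.85, iters=200):
    n = len(out_links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        # every node receives the undamped share (1 - d)/n ...
        new = [(1.0 - damping) / n] * n
        for i, targets in enumerate(out_links):
            # ... plus a damped, equal split of each in-neighbour's rank
            share = damping * rank[i] / len(targets)
            for j in targets:
                new[j] += share
        rank = new
    return rank

# Node 0 links to 1 and 2, node 1 to 2, node 2 back to 0.
r = pagerank([[1, 2], [2], [0]])
print(r)  # node 2 ranks highest, then node 0, then node 1
```

The invariant and fair bets methods can be cast in the same eigenvector form with different normalizations of the paired comparison matrix, which is why the axiomatic comparison of these procedures is natural.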

Relevance:

30.00%

Publisher:

Abstract:

In finance, risk capital allocation raises important questions from both theoretical and practical points of view. How should the risk of a portfolio be shared among its subportfolios? How should capital be reserved to hedge existing risk, and how should it be assigned to the different business units? We use an axiomatic approach to examine risk capital allocation, that is, we require the allocation methods to satisfy fundamental properties. Our starting point is Csóka and Pintér (2011), who show, by generalizing Young's (1985) axiomatization of the Shapley value, that the requirements of Core Compatibility, Equal Treatment Property and Strong Monotonicity are irreconcilable when risk is quantified by a coherent measure of risk. In this paper we examine these requirements using analytical and simulation tools. We study allocation methods used in practice as well as theoretically interesting ones. Our main result is that the problem raised by Csóka and Pintér (2011) is indeed relevant in practical applications, that is, it is not only a theoretical problem. We also believe that, through the characterizations of the examined methods, our paper can serve as a useful guide for practitioners.
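To make the Shapley value concrete as a capital allocation rule, the following sketch (risk figures invented, enumeration over all orderings rather than the closed-form weights) charges each business unit its average marginal contribution to coalition risk:

```python
# Hypothetical sketch of the Shapley value as a capital allocation rule.
# The risk function rho below is invented for illustration; in the setting
# of the paper it would come from a coherent measure of risk evaluated on
# coalitions of subportfolios.

from itertools import permutations

def shapley_allocation(units, rho):
    """rho maps a frozenset of units to the risk capital of that coalition."""
    alloc = {u: 0.0 for u in units}
    orders = list(permutations(units))
    for order in orders:
        seen = frozenset()
        for u in order:
            # marginal risk contribution of u when joining the units before it
            alloc[u] += rho[seen | {u}] - rho[seen]
            seen = seen | {u}
    return {u: v / len(orders) for u, v in alloc.items()}

# Invented subadditive risk figures for three business units
rho = {
    frozenset(): 0.0,
    frozenset("A"): 4.0, frozenset("B"): 3.0, frozenset("C"): 5.0,
    frozenset("AB"): 6.0, frozenset("AC"): 8.0, frozenset("BC"): 7.0,
    frozenset("ABC"): 10.0,
}
alloc = shapley_allocation("ABC", rho)
# Efficiency: the allocations add up exactly to the total risk capital
print(alloc, sum(alloc.values()))
```

The impossibility result cited above says precisely that no rule of this kind can simultaneously satisfy Core Compatibility, Equal Treatment Property and Strong Monotonicity once rho is a coherent risk measure.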

Relevance:

30.00%

Publisher:

Abstract:

This study had two purposes: (a) to develop a theoretical framework integrating and synthesizing findings of prior research regarding stress and burnout among critical care nurses (CCRNs), and (b) to validate the theoretical framework with an empirical study to assure a theory/research-based teaching-learning process for graduate courses preparing nursing clinical specialists and administrators. The methods used to test the theoretical framework included: (a) adopting instruments with reported validity, (b) conducting a pilot study, (c) revising instruments using results of the pilot study and following concurrence of a panel of experts, and (d) establishing correlations within predetermined parameters. The reliability of the tool was determined through the use of Cronbach's alpha coefficient, with a resulting range from .68 to .88 for all measures. The findings supported all the research hypotheses. Correlations were established at r = .23 for statistically significant alphas at the .01 level and r = .16 for alphas at the .05 level. The conclusions indicated three areas of strong correlation among the theoretical variables: (a) work environment stressor antecedents and specific stressor events were correlated significantly with subjective work stress and burnout; (b) subjective work stress (perceived work-related stress) was a function of the work environment stressor antecedents and specific stressor events; and (c) emotional exhaustion, the first phase of burnout, was confirmed to be related to stressor antecedents and specific stressor events. This dimension was found to be a function of the work environment stressor antecedents, modified by the individual characteristics of work and non-work related social support, non-work daily stress, and the number of hours worked per week. The implications of the study for nursing graduate curricula, nursing practice and nursing education were discussed. Recommendations for further research were enumerated.
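The reliability coefficient reported above has a simple closed form; a minimal sketch with invented data (the ratio of variances is unchanged whichever consistent denominator, n or n-1, is used):

```python
# Minimal sketch of Cronbach's alpha:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
# The data below are hypothetical Likert-style scores, not from the study.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of respondent scores per test item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(map(variance, items)) / variance(totals))

# Three items answered by four respondents
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 3, 5], [1, 3, 3, 5]])
print(round(alpha, 3))  # 0.962
```

Values in the study's reported .68-.88 range would arise in the same way from the actual instrument data.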

Relevance:

30.00%

Publisher:

Abstract:

Police often use facial composites during their investigations, yet research suggests that facial composites are generally not effective. The present research included two experiments on facial composites. The first experiment was designed to test the usefulness of the encoding specificity principle for determining when facial composites will be effective. Instructions were used to encourage holistic or featural cues at encoding. The method used to construct facial composites was manipulated to encourage holistic or featural cues at retrieval. The encoding specificity principle suggests that an interaction effect should occur. If the same cues are used at encoding and retrieval, better composites should be constructed than when the cues are not the same. However, neither the expected interaction nor the main effects for encoding and retrieval were significant. The second study was conducted to assess the effectiveness of composites generated by two different facial composite construction systems, E-Fit and Mac-A-Mug Pro. These systems differ in that the E-Fit system uses more sophisticated methods of composite construction and may construct better quality facial composites. A comparison of E-Fit and Mac-A-Mug Pro composites demonstrated that E-Fit composites were of better quality than Mac-A-Mug Pro composites. However, neither E-Fit nor Mac-A-Mug Pro composites were useful for identifying the target person from a photograph lineup. Further, lineup performance was at floor level such that both E-Fit and Mac-A-Mug Pro composites were no more useful than a verbal description. Possible limitations of the studies are discussed, as well as suggestions for future research.

Relevance:

30.00%

Publisher:

Abstract:

The contractile state of microcirculatory vessels is a major determinant of the blood pressure of the whole systemic circulation. Continuous bi-directional communication exists between the endothelial cells (ECs) and smooth muscle cells (SMCs) that regulates calcium (Ca2+) dynamics in these cells. This study presents theoretical approaches to understand some important and currently unresolved microcirculatory phenomena.

Agonist-induced events at local sites have been shown to spread long distances in the microcirculation. We have developed a multicellular computational model by integrating detailed single EC and SMC models with gap junction and nitric oxide (NO) coupling to understand the mechanisms behind this effect. Simulations suggest that spreading vasodilation mainly occurs through Ca2+-independent passive conduction of hyperpolarization in RMAs. The model predicts a superior role for intercellular diffusion of inositol (1,4,5)-trisphosphate (IP3) over Ca2+ in modulating the spreading response.

Endothelium-derived signals are initiated even during vasoconstriction of stimulated SMCs, by the movement of Ca2+ and/or IP3 into the EC, which provides hyperpolarizing feedback to SMCs to counter the ongoing constriction. Myoendothelial projections (MPs) present in the ECs have recently been proposed to play a role in myoendothelial feedback. We have developed two models, using compartmental and 2D finite element methods, to examine the role of these MPs by adding a subcompartment in the EC that simulates an MP with localized intermediate-conductance calcium-activated potassium channels (IKCa) and IP3 receptors (IP3R). Both models predicted IP3-mediated high Ca2+ gradients in the MP after SMC stimulation, with limited global spread. This Ca2+ transient generated a hyperpolarizing feedback of ~2-3 mV.

Endothelium-derived hyperpolarizing factor (EDHF) is the dominant form of endothelial control of SMC constriction in the microcirculation. A number of factors have been proposed for the role of EDHF, but no single pathway is agreed upon. We have examined the potential of myoendothelial gap junctions (MEGJs) and potassium (K+) accumulation as EDHF using two models (compartmental and 2D finite element). An extra compartment is added in the SMC to simulate microdomains (MDs) containing NaKα2-isoform sodium-potassium pumps. Simulations predict that MEGJ coupling is much stronger in producing EDHF than K+ accumulation alone. Conversely, K+ accumulation can alter other important parameters (EC Vm, IKCa current) and inhibit its own release as well as EDHF conduction via MEGJs. The models developed in this study are essential building blocks for future models and provide important insights into the current understanding of myoendothelial feedback and EDHF.

Relevance:

30.00%

Publisher:

Abstract:

There is a growing body of literature that provides evidence for the efficacy of positive youth development programs in general, and preliminary empirical support for the efficacy of the Changing Lives Program (CLP) in particular. This dissertation sought to extend previous efforts to develop and preliminarily examine the Transformative Goal Attainment Scale (TGAS) as a measure of participant empowerment in the promotion of positive development. Consistent with recent advances in the use of qualitative research methods, this dissertation further investigated the utility of Relational Data Analysis (RDA) for categorizing qualitative open-ended response data. In particular, a qualitative index of Transformative Goals (TG) was developed to complement the previously developed quantitative index of Transformative Goal Attainment (TGA), and RDA procedures for calculating reliability and content validity were refined. Second, as a Stage I pilot/feasibility study, this study preliminarily examined the potentially mediating role of empowerment, as indexed by the TGAS, in the promotion of positive development.

Fifty-seven participants took part in this study: forty CLP intervention participants and seventeen control condition participants. All 57 participants were administered the study's measures just prior to and just following the fall 2003 semester. The study thus used a short-term longitudinal quasi-experimental research design with a comparison control group.

RDA procedures were refined and applied to the categorization of open-ended response data regarding participants' transformative goals (TG) and future possible selves (PSQ-QE). These analyses revealed relatively strong, indirect evidence for the construct validity of the categories as well as for their theoretically meaningful structural organization, thereby providing sufficient support for the utility of RDA procedures in the categorization of qualitative open-ended response data. In addition, transformative goals (TG), future possible selves (PSQ-QE), and the quantitative index of perceived goal attainment (TGA) were evaluated as potential mediators of positive development by testing their relationships to other indices of positive intervention outcome within a four-step method involving both analysis of variance (ANOVA and RM-ANOVAs) and regression analysis. Though more limited in scope than the efforts at developing and refining the measures of these mediators, these results were also promising.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation introduces a new approach for assessing the effects of pediatric epilepsy on the language connectome. Two novel data-driven network construction approaches are presented. These methods rely on connecting different brain regions using either the extent or the intensity of language-related activations, as identified by independent component analysis of fMRI data. An auditory description decision task (ADDT) paradigm was used to activate the language network in 29 patients and 30 controls recruited from three major pediatric hospitals. Empirical evaluations illustrated that pediatric epilepsy can cause, or is associated with, a reduction in network efficiency. Patients showed a propensity to inefficiently employ the whole-brain network to perform the ADDT language task; in contrast, controls appeared to use smaller, segregated network components efficiently to achieve the same task. To explain the causes of the decreased efficiency, graph-theoretical analysis was carried out. The analysis revealed no substantial global network feature differences between the patient and control groups. It also showed that for both subject groups the language network exhibited small-world characteristics; however, the patients' extent-of-activation network showed a tendency towards more random networks. It was also shown that the intensity-of-activation network displayed ipsilateral hub reorganization at the local level. The left hemispheric hubs displayed greater centrality values for patients, whereas the right hemispheric hubs displayed greater centrality values for controls. This hub hemispheric disparity was not correlated with the atypical right language laterality found in six patients. Finally, it was shown that a multi-level unsupervised clustering scheme based on self-organizing maps (a type of artificial neural network) and k-means was able to fairly and blindly separate the subjects into their respective patient or control groups. The clustering was initiated using the local nodal centrality measurements only. Compared to the extent-of-activation network, clustering on the intensity-of-activation network demonstrated better precision. This outcome supports the assertion that the local centrality differences presented by the intensity-of-activation network can be associated with focal epilepsy.
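The final clustering step can be sketched as follows (feature values, initialization scheme and function names invented; the dissertation's pipeline additionally uses self-organizing maps before k-means):

```python
# Hypothetical sketch of k-means (k = 2) on per-subject feature vectors,
# standing in for the nodal-centrality features described above.
# Deterministic initialization keeps the example reproducible.

def kmeans2(points, iters=20):
    # naive init: first and last point as the two starting centers
    centers = [points[0], points[-1]]
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        labels = [min((0, 1),
                      key=lambda c: sum((p - q) ** 2
                                        for p, q in zip(pt, centers[c])))
                  for pt in points]
        # update step: move each center to the mean of its members
        for c in (0, 1):
            members = [pt for pt, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(d) / len(members)
                                   for d in zip(*members))
    return labels

# Two invented groups of centrality profiles (e.g. controls low, patients high)
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.3),
        (5.0, 5.1), (5.2, 4.9), (4.8, 5.3)]
labels = kmeans2(data)
print(labels)  # first three points share one label, last three the other
```

The "blind" separation claimed above corresponds to this unsupervised labelling agreeing with the (withheld) patient/control group membership.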

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

In Czech schools two methods of teaching reading are used: the analytic-synthetic (conventional) method and the genetic method (created in the 1990s). They differ in their theoretical foundations and in their methodology. The aim of this paper is to describe the above-mentioned theoretical approaches and to present the results of a study that followed the differences in the development of initial reading skills between these methods. A total of 452 first-grade children (age 6-8) were assessed with a battery of reading tests at the beginning and at the end of the first grade and at the beginning of the second grade; 350 pupils participated at all three time points. Based on the data analysis, the developmental dynamics of reading skills under both methods and the main differences in several aspects of reading ability (e.g. reading speed, reading technique, error rate) are described. The main focus is on the development of reading comprehension. Results show that pupils instructed using the genetic approach scored significantly better on the reading comprehension tests used, especially in the first grade. Statistically significant differences also occurred between classes independently of the method. Therefore, other factors, such as the teacher's role and class composition, are discussed.

Relevance:

30.00%

Publisher:

Abstract:

With the continued miniaturization and increasing performance of electronic devices, new technical challenges have arisen. One such issue is delamination occurring at critical interfaces inside the device. This major reliability issue can occur during the manufacturing process or during normal use of the device. Proper evaluation of the adhesion strength of critical interfaces early in the product development cycle can help reduce reliability issues and time-to-market of the product. However, conventional adhesion strength testing is inherently limited in the face of package miniaturization, which brings about further technical challenges to quantify design integrity and reliability. Although there are many different interfaces in today's advanced electronic packages, they can be generalized into two main categories: 1) rigid to rigid connections with a thin flexible polymeric layer in between, or 2) a thin film membrane on a rigid structure. Knowing that every technique has its own advantages and disadvantages, multiple testing methods must be enhanced and developed to be able to accommodate all the interfaces encountered for emerging electronic packaging technologies. For evaluating the adhesion strength of high adhesion strength interfaces in thin multilayer structures a novel adhesion test configuration called “single cantilever adhesion test (SCAT)” is proposed and implemented for an epoxy molding compound (EMC) and photo solder resist (PSR) interface. The test method is then shown to be capable of comparing and selecting the stronger of two potential EMC/PSR material sets. Additionally, a theoretical approach for establishing the applicable testing domain for a four-point bending test method was presented. For evaluating polymeric films on rigid substrates, major testing challenges are encountered for reducing testing scatter and for factoring in the potentially degrading effect of environmental conditioning on the material properties of the film. 
An advanced blister test with a predefined area, based on an elasto-plastic analytical solution, was developed and implemented for a conformal coating used to prevent tin whisker growth. This test method was then extended with a numerical approach for evaluating the adhesion strength when the polymer film's properties are unknown.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we consider the a posteriori and a priori error analysis of discontinuous Galerkin interior penalty methods for second-order partial differential equations with nonnegative characteristic form on anisotropically refined computational meshes. In particular, we discuss the question of error estimation for linear target functionals, such as the outflow flux and the local average of the solution. Based on our a posteriori error bound we design and implement the corresponding adaptive algorithm to ensure reliable and efficient control of the error in the prescribed functional to within a given tolerance. This involves exploiting both local isotropic and anisotropic mesh refinement. The theoretical results are illustrated by a series of numerical experiments.
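The goal-oriented error control described above is typically built on the dual-weighted-residual idea; a schematic form of the underlying identity (notation assumed here, not taken from the paper, with B the bilinear form, ℓ the load functional and J the target functional) is:

```latex
% Dual problem: find z such that B(w, z) = J(w) for all admissible w.
% Galerkin orthogonality then gives the error representation
J(u) - J(u_h) \;=\; \ell(z - z_h) - B(u_h,\, z - z_h)
             \;\approx\; \sum_{\kappa \in \mathcal{T}_h} \eta_\kappa ,
% where each local indicator \eta_\kappa weights the element residual of
% u_h by the (numerically approximated) dual solution z, and the adaptive
% algorithm refines -- isotropically or anisotropically -- the elements
% where |\eta_\kappa| is large.
```

Stopping when the sum of the indicators falls below the prescribed tolerance is what yields the "reliable and efficient control of the error in the prescribed functional" mentioned above.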

Relevance:

30.00%

Publisher:

Abstract:

We consider the a priori error analysis of hp-version interior penalty discontinuous Galerkin methods for second-order partial differential equations with nonnegative characteristic form under weak assumptions on the mesh design and the local finite element spaces employed. In particular, we prove a priori hp-error bounds for linear target functionals of the solution, on (possibly) anisotropic computational meshes with anisotropic tensor-product polynomial basis functions. The theoretical results are illustrated by a numerical experiment.