968 results for Analytic function theory


Relevance:

30.00%

Publisher:

Abstract:

The identification of biomarkers of vascular cognitive impairment is urgent for its early diagnosis. The aim of this study was to detect and monitor changes in brain structure and connectivity, and to correlate them with the decline in executive function. We examined the feasibility of early diagnostic magnetic resonance imaging (MRI) to predict cognitive impairment before onset in an animal model of chronic hypertension, the Spontaneously Hypertensive Rat. Cognitive performance was tested in an operant conditioning paradigm that evaluated learning, memory, and behavioral flexibility skills. Behavioral tests were coupled with longitudinal diffusion-weighted imaging acquired with 126 diffusion gradient directions and 0.3 mm³ isometric resolution at 10, 14, 18, 22, 26, and 40 weeks after birth. Diffusion-weighted imaging was analyzed in two different ways: by regional characterization of diffusion tensor imaging (DTI) indices, and by assessing changes in structural brain network organization based on Q-Ball tractography. Already at the earliest evaluated time points, DTI scalar maps revealed significant differences in many regions, suggesting loss of integrity in the white and gray matter of spontaneously hypertensive rats when compared to normotensive control rats. In addition, graph theory analysis of the structural brain network demonstrated a significant decrease in hierarchical modularity and in global and local efficiency, with predictive value as shown by a regional three-fold cross-validation study. Moreover, these decreases were significantly correlated with the behavioral performance deficits observed at subsequent time points, suggesting that diffusion-weighted imaging and connectivity studies can reveal neuroimaging alterations even before overt signs of cognitive impairment become apparent.
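The graph metrics named above, global and local efficiency, are straightforward to compute once a connectome has been extracted from tractography. A minimal sketch, assuming the networkx library and using a random graph purely as a stand-in for a tractography-derived network:

```python
# Hedged sketch: global and local efficiency of a structural brain network,
# as in the graph-theory analysis described above. The random graph is a
# stand-in for a tractography-derived connectome; networkx is assumed.
import networkx as nx

# Stand-in connectome: 90 nodes (e.g. atlas regions), random edges.
G = nx.erdos_renyi_graph(n=90, p=0.1, seed=42)

global_eff = nx.global_efficiency(G)  # mean inverse shortest-path length over node pairs
local_eff = nx.local_efficiency(G)    # mean efficiency of each node's neighborhood subgraph

print(f"global efficiency: {global_eff:.3f}")
print(f"local efficiency:  {local_eff:.3f}")
```

In a longitudinal design like the one described, these two numbers would be computed per animal and per time point, then compared between groups.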

Relevance:

30.00%

Publisher:

Abstract:

Logistics management is increasingly being recognised by many companies to be of critical concern. The logistics function includes, directly or indirectly, many of the new areas for achieving or maintaining competitive advantage that companies have been forced to develop due to increasing competitive pressures. The key to achieving a competitive advantage is to manage the logistics function strategically, which involves determining the most cost-effective method of providing the necessary customer service levels from the many combinations of operating procedures in the areas of transportation, warehousing, order processing and information systems, production, and inventory management. In this thesis, a comprehensive distribution logistics strategic management process is formed by integrating the periodic strategic planning process with a continuous strategic issues management process. Strategic planning is used for defining the basic objectives for a company and assuring cooperation and synergy between the different functions of a company, while strategic issues management is used on a continuous basis in order to deal with environmental and internal turbulence. The strategic planning subprocess consists of the following main phases: (1) situational analyses, (2) defining the vision and strategic goals for the logistics function, (3) determining objectives and strategies, (4) drawing up tactical action plans, and (5) evaluating the implementation of the plans and making the needed adjustments. The aim of the strategic issues management subprocess is to continuously scan the environment and the organisation for early identification of the issues having a significant impact on the logistics function, using the following steps: (1) the identification of trends, (2) assessing the impact and urgency of the identified trends, (3) assigning priorities to the issues, and (4) planning responses to the issues.
The Analytic Hierarchy Process (AHP) is a systematic procedure for structuring any problem. AHP is based on the following three principles: decomposition, comparative judgements, and synthesis of priorities. AHP starts by decomposing a complex, multicriteria problem into a hierarchy where each level consists of a few manageable elements which are then decomposed into another set of elements. The second step is to use a measurement methodology to establish priorities among the elements within each level of the hierarchy. The third step in using AHP is to synthesise the priorities of the elements to establish the overall priorities for the decision alternatives. In this thesis, decision support systems are developed for different areas of distribution logistics strategic management by applying the Analytic Hierarchy Process. The areas covered are: (1) logistics strategic issues management, (2) planning of logistic structure, (3) warehouse site selection, (4) inventory forecasting, (5) defining logistic action and development plans, (6) choosing a distribution logistics strategy, (7) analysing and selecting transport service providers, (8) defining the logistic vision and strategic goals, (9) benchmarking logistic performance, and (10) logistic service management. The thesis demonstrates the potential of AHP as a systematic and analytic approach to distribution logistics strategic management.
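The second step described above, establishing priorities within a level of the hierarchy, is commonly done by taking the principal eigenvector of a reciprocal pairwise-comparison matrix. A minimal sketch, assuming numpy; the judgement values are invented for illustration, not taken from the thesis:

```python
# Hedged sketch of the AHP priority-synthesis step: priorities are derived
# from a reciprocal pairwise-comparison matrix as its principal eigenvector.
import numpy as np

# A[i, j] = how strongly alternative i is preferred to j (Saaty 1-9 scale);
# the matrix is reciprocal: A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)        # largest eigenvalue
w = np.abs(eigvecs[:, principal].real)     # its eigenvector, sign-normalized
priorities = w / w.sum()                   # normalized priority vector

print(priorities)  # decreasing weights: alternative 1 dominates
```

The same computation is repeated level by level, and the third step (synthesis) weights each lower-level vector by its parent's priority.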

Relevance:

30.00%

Publisher:

Abstract:

In this article I intend to show that certain aspects of A.N. Whitehead's philosophy of organism, and especially his epochal theory of time as mainly expounded in his well-known work Process and Reality, can serve to clarify the underlying assumptions that shape nonstandard mathematical theories as such, and also as metatheories of quantum mechanics. Concerning the latter issue, I point to an already significant body of research on nonstandard versions of quantum mechanics; two of these approaches are chosen to be critically presented in relation to the scope of this work. The main point of the paper is that, insofar as we can refer a nonstandard mathematical entity to a kind of axiomatic formalization essentially 'codifying' an underlying mental process indescribable as such by analytic means, we can possibly apply certain principles of Whitehead's metaphysical scheme focused on the key notion of process, which is generally conceived as the becoming of actual entities. This is done in the sense of a unifying approach providing an interpretation of nonstandard mathematical theories as such and also, in their metatheoretical status, as a formalization of the empirical-experimental context of quantum mechanics.

Relevance:

30.00%

Publisher:

Abstract:

This research aimed to compare two broiler breeder hen ages during the incubation period with regard to management, using the Analytic Hierarchy Process (AHP) method. This method makes it possible to analyze a multicriteria problem and assists in decision making. The study was carried out at a commercial hatchery located in São Paulo, Brazil. Two broiler breeder ages (42 and 56 weeks) were compared with respect to production rate. The same production indices were recorded at both ages and submitted to multicriteria decision analysis using the AHP method. The results indicate that 42-week-old broiler breeders presented better performance than 56-week-old ones, and that the setter phase (incubation) is more critical than the hatcher phase. The AHP method was efficient for this analysis and can serve as a methodological basis for future studies to improve the hatchability of broiler eggs.
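Applied AHP studies such as this one normally also verify that the pairwise judgements are internally consistent before trusting the resulting priorities. A hedged sketch of Saaty's consistency check, assuming numpy; the judgement matrix is invented for illustration:

```python
# Hedged sketch: Saaty's consistency ratio CR = CI / RI, where
# CI = (lambda_max - n) / (n - 1) and RI is the random index for size n.
# The example matrix is perfectly consistent, so CR comes out near zero.
import numpy as np

A = np.array([
    [1.0, 2.0, 4.0],
    [1/2, 1.0, 2.0],
    [1/4, 1/2, 1.0],
])
n = A.shape[0]

lambda_max = np.max(np.linalg.eigvals(A).real)
CI = (lambda_max - n) / (n - 1)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random-index table
CR = CI / RI

assert CR < 0.10  # judgements acceptable by the usual 10% rule
```

A CR above roughly 0.10 signals that the decision-maker's comparisons contradict each other and should be revised.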

Relevance:

30.00%

Publisher:

Abstract:

Permanent magnet synchronous machines (PMSMs) have become widely used because of their high efficiency compared to synchronous machines with an excitation winding or to induction motors. This feature of the PMSM is achieved through the use of permanent magnets (PMs) as the main excitation source. The magnetic properties of the PM have a significant influence on all the PMSM characteristics. Recent observations of PM material properties in rotating machines revealed that the magnets do not necessarily operate in the second quadrant of the demagnetization curve, which makes them prone to hysteresis losses. Moreover, no good analytical approach has yet been derived for the magnetic flux density distribution along the PM during different short-circuit faults. The main task of this thesis is to derive a simple analytical tool which can predict the magnetic flux density distribution along a rotor-surface-mounted PM in two cases: during normal operation, and at the moment of a three-phase symmetrical short circuit that is worst from the PM's point of view. Surface-mounted PMSMs were selected because of their prevalence and relatively simple construction. The proposed model is based on the combination of two theories: magnetic circuit theory and space vector theory. A comparison of finite element results for the normal operating mode with the results calculated by the proposed model shows good accuracy of the model in the parts of the PM most prone to hysteresis losses. The comparison for the three-phase symmetrical short circuit, however, revealed significant inaccuracy of the proposed model relative to the finite element results. The reasons for this inaccuracy are analyzed, including the impact of the Carter factor theory and of the assumption that air has the same permeability as the PM.
Propositions for further development of the model are presented.
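The magnetic-circuit side of such a model can be illustrated with a textbook no-load load-line calculation for a surface-mounted PM. This is a hedged back-of-envelope sketch, not the thesis model: all numbers are assumed, and leakage, fringing and saturation are ignored.

```python
# Hedged sketch: no-load operating flux density of a surface-mounted PM
# from a series magnetic circuit of magnet + air gap (equal cross-sections).
# All parameter values are illustrative assumptions.
B_r = 1.2    # remanence of the PM, T (NdFeB-like, assumed)
mu_r = 1.05  # relative recoil permeability of the PM
l_m = 4e-3   # magnet thickness, m
g = 1e-3     # effective air gap, m

# Load line of the series circuit: B_m = B_r / (1 + mu_r * g / l_m)
B_m = B_r / (1 + mu_r * g / l_m)

print(f"operating flux density: {B_m:.2f} T")  # below B_r, as expected
```

Armature reaction during a short circuit shifts this operating point further down the demagnetization curve, which is exactly the regime the thesis model sets out to predict analytically.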

Relevance:

30.00%

Publisher:

Abstract:

The study centers on the power of Right-Wing Authoritarianism (RWA) and Social Dominance Orientation (SDO) as predictors of prejudice against stereotypical and non-stereotypical homosexuals under the threat of death and the threat of uncertainty. Right-wing authoritarianism (RWA) is an individual difference variable that measures the tendency for individuals to unquestioningly follow those perceived to be authorities. Social Dominance Orientation (SDO) is an individual difference variable that measures the degree to which an individual prefers inequality among social groups. The RWA and SDO Scales are considered to be two of the strongest predictors of prejudice, such as prejudice against homosexuals. The study focuses on the unique predictive power of these two variables in predicting prejudice against homosexuals. The study also examines the role of situational threat in prejudice, specifically the threat of death (mortality salience) and the threat of uncertainty (uncertainty salience). Competing predictions from theories involving the threat of death (Terror Management Theory) and the threat of uncertainty (Uncertainty Management Theory) are also tested. The preference for expected information in the form of stereotypes concerning male homosexuals (that is, a stereotypical or non-stereotypical homosexual) was tested. The difference between the predictive power of RWA and SDO was examined by measuring how these variables predict liking of a stereotypical or non-stereotypical homosexual under the threat of death, the threat of uncertainty, or a control condition. Along with completing a measure for RWA and a measure for SDO, participants were asked to think of their own death, of being uncertain, or about watching television, and were then asked to read about a week in the life of either a stereotypical or non-stereotypical male homosexual. Participants were then asked to evaluate the individual and his essay.
Based on the participants' evaluations, results from 180 heterosexual university students show that RWA and SDO are strong predictors of disliking of a stereotypical homosexual under the threat of uncertainty and of a non-stereotypical homosexual under the threat of death. More specifically, results show that RWA is a particularly strong predictor of disliking of a stereotypical homosexual under the threat of uncertainty, whereas SDO is an exceptionally strong predictor of disliking of the non-stereotypical homosexual under the threat of death. This further adds to the notion that RWA and SDO are indeed unique predictors of prejudice. Implications are also explored, including the fact that the study simultaneously examined the role of individual difference variables and situational threat variables, as well as exploratory analysis on Dominating Authoritarians.

Relevance:

30.00%

Publisher:

Abstract:

A general derivation of the anharmonic coefficients for a periodic lattice, invoking the special case of the central force interaction, is presented. All of the contributions to the mean square displacement (MSD) to O(λ⁴) in perturbation theory are enumerated. A direct correspondence is found between the high-temperature-limit MSD and the high-temperature-limit free energy contributions up to and including O(λ⁴). This correspondence follows from the detailed derivation of some of the contributions to the MSD. Numerical results are obtained for all the MSD contributions to O(λ⁴) using the Lennard-Jones potential, for the lattice constants and temperatures for which the Monte Carlo results were calculated by Heiser, Shukla and Cowley. The Peierls approximation is also employed in order to simplify the numerical evaluation of the MSD contributions. The numerical results indicate convergence of the perturbation expansion up to 75% of the melting temperature of the solid (T_M) for the exact calculation; however, better agreement with the Monte Carlo results is not obtained when the total of all O(λ⁴) contributions is added to the O(λ²) perturbation theory results. Using the Peierls approximation, the expansion converges up to 45% of T_M. The MSD contributions arising in the Green's function method of Shukla and Hubschle are derived and enumerated up to and including O(λ⁸). The total MSD from these selected contributions is in excellent agreement with their results at all temperatures. Theoretical values of the recoilless fraction for krypton are calculated from the MSD contributions for both the Lennard-Jones and Aziz potentials. The agreement with experimental values is quite good.

Relevance:

30.00%

Publisher:

Abstract:

This essay reviews the decision-making process that led to India exploding a nuclear device in May 1974. An examination of the Analytic, Cybernetic and Cognitive Theories of decision will enable a greater understanding of the events that led up to the 1974 test. While each theory is seen to be only partially useful, it is only by synthesising the three theories that a comprehensive account of the 1974 test can be given. To achieve this analysis, literature on decision-making in national security issues is reviewed, as well as the domestic and international environment in which the decision-makers involved operated. Finally, the rationale for the test in 1974 is examined. The conclusion revealed is that the explosion of a nuclear device by India in 1974 was primarily related to improving Indian international prestige among Third World countries and uniting a rapidly disintegrating Indian societal consensus. In themselves, individual decision-making theories were found to be of little use, but a combination of the various elements allowed a greater comprehension of the events leading up to the test than might otherwise have been the case.

Relevance:

30.00%

Publisher:

Abstract:

Classical relational databases lack proper ways to manage certain real-world situations, including imprecise or uncertain data. Fuzzy databases overcome this limitation by allowing each entry in the table to be a fuzzy set, where each element of the corresponding domain is assigned a membership degree from the real interval [0, 1]. But this fuzzy mechanism becomes inappropriate in modelling scenarios where data might be incomparable. Therefore, we become interested in a further generalization of the fuzzy database into the L-fuzzy database. In such a database, the characteristic function of a fuzzy set maps into an arbitrary complete Brouwerian lattice L. From the query-language perspective, the fuzzy database language FSQL extends the regular Structured Query Language (SQL) by adding fuzzy-specific constructions. In addition to that, the L-fuzzy query language LFSQL introduces appropriate linguistic operations to define and manipulate inexact data in an L-fuzzy database. This research mainly focuses on defining the semantics of LFSQL. However, it requires an abstract algebraic theory which can be used to prove all the properties of, and operations on, L-fuzzy relations. In our study, we show that the theory of arrow categories forms a suitable framework for that. Therefore, we define the semantics of LFSQL in the abstract notion of an arrow category. In addition, we implement the operations of L-fuzzy relations in Haskell and develop a parser that translates algebraic expressions into our implementation.
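The L-fuzzy idea can be made concrete with a tiny example. A hedged sketch, where L is the powerset of {'a', 'b'} ordered by inclusion (meet = intersection, join = union), a small complete Brouwerian lattice in which {'a'} and {'b'} are incomparable; the relations themselves are invented for illustration and the thesis's Haskell implementation is not reproduced here:

```python
# Hedged sketch: L-fuzzy relations with membership degrees in a lattice L
# (here: subsets of {'a','b'} under inclusion) instead of [0, 1].
xs, ys, zs = ['x1', 'x2'], ['y1', 'y2'], ['z1']
top, bot = frozenset('ab'), frozenset()
a, b = frozenset('a'), frozenset('b')

# L-fuzzy relations R on X x Y and S on Y x Z, with degrees in L.
R = {('x1', 'y1'): top, ('x1', 'y2'): a,
     ('x2', 'y1'): b,   ('x2', 'y2'): bot}
S = {('y1', 'z1'): a, ('y2', 'z1'): top}

def compose(R, S, xs, ys, zs):
    """Relational composition: (R;S)(x,z) = join over y of R(x,y) meet S(y,z)."""
    return {(x, z): frozenset().union(*(R[x, y] & S[y, z] for y in ys))
            for x in xs for z in zs}

T = compose(R, S, xs, ys, zs)
assert T[('x1', 'z1')] == a    # x1 reaches z1 to degree {'a'}
assert T[('x2', 'z1')] == bot  # x2 reaches z1 to no degree at all
```

With degrees like {'a'} and {'b'} being incomparable, the lattice captures exactly the situations the abstract says plain [0, 1] membership cannot.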

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new theory of random consumer demand. The primitive is a collection of probability distributions, rather than a binary preference. Various assumptions constrain these distributions, including analogues of common assumptions about preferences such as transitivity, monotonicity and convexity. Two results establish a complete representation of theoretically consistent random demand. The purpose of this theory of random consumer demand is application to empirical consumer demand problems. To this end, the theory has several desirable properties. It is intrinsically stochastic, so the econometrician can apply it directly without adding extrinsic randomness in the form of residuals. Random demand is parsimoniously represented by a single function on the consumption set. Finally, we have a practical method for statistical inference based on the theory, described in McCausland (2004), a companion paper.

Relevance:

30.00%

Publisher:

Abstract:

McCausland (2004a) describes a new theory of random consumer demand. Theoretically consistent random demand can be represented by a "regular" "L-utility" function on the consumption set X. The present paper is about Bayesian inference for regular L-utility functions. We express prior and posterior uncertainty in terms of distributions over the infinite-dimensional parameter set of a flexible functional form. We propose a class of proper priors on the parameter set. The priors are flexible, in the sense that they put positive probability in the neighborhood of any L-utility function that is regular on a large subset bar(X) of X; and regular, in the sense that they assign zero probability to the set of L-utility functions that are irregular on bar(X). We propose methods of Bayesian inference for an environment with indivisible goods, leaving the more difficult case of infinitely divisible goods for another paper. We analyse individual choice data from a consumer experiment described in Harbaugh et al. (2001).

Relevance:

30.00%

Publisher:

Abstract:

Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.

Relevance:

30.00%

Publisher:

Abstract:

Let $\displaystyle P(z):=\sum_{\nu=0}^na_\nu z^{\nu}$ be a polynomial of degree $n$ and let $\displaystyle M:=\sup_{|z|=1}|P(z)|.$ Without any additional restriction, it is known that $|P'(z)|\leq Mn$ for $|z|\leq 1$ (Bernstein's inequality). If we now suppose that the zeros of the polynomial $P$ lie outside the circle $|z|=k,$ what improvement can be made to Bernstein's inequality? It is already known [{\bf \ref{Mal1}}] that in the case $k\geq 1$ we have $$(*) \qquad |P'(z)|\leq \frac{n}{1+k}M \qquad (|z|\leq 1);$$ what happens in the case $k < 1$? What is the inequality analogous to $(*)$ for an entire function of exponential type $\tau$? On the other hand, if we suppose that $P$ has all its zeros in $|z|\geq k \, \, (k\geq 1),$ what is the estimate of $|P'(z)|$ on the unit circle, in terms of the first four terms of its power series expansion about the origin? This thesis constitutes a contribution to the analytic theory of polynomials in the light of these questions.
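The inequality $(*)$ lends itself to a quick numerical sanity check. A hedged sketch, assuming numpy; the polynomial and the radius of its zeros are arbitrary illustrations:

```python
# Hedged sketch: numerically checking |P'(z)| <= n*M/(1+k) on |z| = 1
# for a polynomial whose zeros all lie outside |z| = k with k >= 1.
import numpy as np
from numpy.polynomial import polynomial as Poly

k = 2.0
zeros = 2.5 * np.exp(2j * np.pi * np.arange(4) / 4)  # four zeros on |z| = 2.5 > k
coeffs = Poly.polyfromroots(zeros)                   # a_0 + a_1 z + ... + a_n z^n
n = len(zeros)

theta = np.linspace(0.0, 2.0 * np.pi, 4000, endpoint=False)
z = np.exp(1j * theta)                               # sample points on the unit circle

M = np.abs(Poly.polyval(z, coeffs)).max()            # approximates sup |P| on |z| = 1
maxdP = np.abs(Poly.polyval(z, Poly.polyder(coeffs))).max()

assert maxdP <= n * M / (1 + k) + 1e-9               # the bound (*) holds
```

Here the bound is far from tight; the thesis's question is precisely how such bounds behave in the remaining regimes ($k < 1$, entire functions of exponential type).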

Relevance:

30.00%

Publisher:

Abstract:

Quantum information theory has developed at a breathtaking pace over the last twenty years, with analogues and extensions of the source coding and noisy channel coding theorems for unidirectional communication. For interactive communication, a quantum analogue of communication complexity has been developed, in which quantum protocols can perform exponentially better than the best classical protocols for certain classical tasks. However, quantum information is much more sensitive to noise than classical information. It is therefore imperative to use quantum resources to their full potential. In this thesis, we study interactive quantum protocols from the point of view of information theory and study the analogues of source coding and noisy channel coding. The framework considered is that of communication complexity: Alice and Bob want to perform a bipartite quantum computation while minimizing the amount of communication exchanged, without regard to the cost of local computations. Our results are organized into three distinct chapters, each of which can be read independently. Given the central role it plays in the context of interactive compression, one chapter is dedicated to the study of the task of quantum state redistribution. We prove lower bounds on the communication costs required in an interactive setting. We also prove bounds achievable with a single message, in a one-shot setting. In a subsequent chapter, we define a new notion of quantum information complexity. It characterizes the amount of information, rather than communication, that Alice and Bob must exchange in order to compute a bipartite task.
We prove many structural properties for this quantity, and give it an operational interpretation as amortized quantum communication complexity. In the particular case of classical inputs, we give another characterization that quantifies the cost incurred by a quantum protocol that forgets classical information. Two applications are presented: the first general direct sum result for quantum communication complexity with more than one round, as well as a bound, optimal up to a polylogarithmic term, on the bounded-round quantum communication complexity of the disjointness function. In a final chapter, we initiate the study of the interactive quantum capacity of noisy channels. Since techniques for distributing entanglement are well studied, we focus on a model with perfect pre-shared entanglement and noisy classical communication. We show that, in the harder setting of adversarial errors, a maximal error rate of one half minus epsilon can be tolerated, for arbitrarily small epsilon greater than zero, at a positive communication rate. It follows that random-noise channels with positive capacity for unidirectional transmission also have positive capacity for interactive quantum communication. We conclude with a discussion of our results and of future directions for this research program on an interactive quantum information theory.

Relevance:

30.00%

Publisher:

Abstract:

The present study concerns the characterization of probability distributions using the residual entropy function. The concept of entropy is extensively used in the literature as a quantitative measure of the uncertainty associated with a random phenomenon. The commonly used lifetime models in reliability theory are the exponential, Pareto, Beta, Weibull and gamma distributions. Several characterization theorems are obtained for the above models using reliability concepts such as the failure rate, mean residual life function, vitality function, and variance residual life function. Most of the work on characterization of distributions in the reliability context centers around the failure rate or the residual life function. An important aspect of interest in the study of entropy is that of locating distributions for which Shannon's entropy is maximum subject to certain restrictions on the underlying random variable. The geometric vitality function is introduced and its properties are examined; it is established that the geometric vitality function determines the distribution uniquely. The problem of averaging the residual entropy function is examined, and truncated versions of higher-order entropies are defined. In this study it is established that the residual entropy function determines the distribution uniquely, and that its constancy is characteristic of the geometric distribution.