813 results for Discrete Choice
Abstract:
An attempt is made by the researcher to establish a theory of discrete functions in the complex plane. Classical analysis, q-basic theory, monodiffric theory, preholomorphic theory and q-analytic theory have been utilised to develop concepts such as differentiation, integration and special functions.
Abstract:
This thesis is an attempt to initiate the development of a discrete geometry of the discrete plane H = {(q^m x_0, q^n y_0); m, n ∈ Z, the set of integers}, where q ∈ (0,1) is fixed and (x_0, y_0) is a fixed point in the first quadrant of the complex plane, x_0, y_0 ≠ 0. The discrete plane was first considered by Harman in 1972, to evolve a discrete analytic function theory for geometric difference functions. We shall mention briefly, through various sections, the principle of discretization, an outline of discrete analytic function theory, the concept of geometry of space, and a summary of the work done in this thesis.
Abstract:
There is a recent trend to describe physical phenomena without the use of infinitesimals or infinities. This has been accomplished by replacing differential calculus with finite difference theory. Discrete function theory was first introduced in 1941. This theory is concerned with the study of functions defined on a discrete set of points in the complex plane. The theory was extensively developed for functions defined on a Gaussian lattice. In 1972 a very suitable lattice H = {(q^m x_0, q^n y_0); x_0 > 0, y_0 > 0, 0 < q < 1, m, n ∈ Z} was found and discrete analytic function theory was developed on it. Very recently some work has been done in discrete monodiffric function theory for functions defined on H. The theory of pseudoanalytic functions is a generalisation of the theory of analytic functions: when the generator becomes the identity, i.e., (1, i), the theory of pseudoanalytic functions reduces to the theory of analytic functions. Though the theory of pseudoanalytic functions plays an important role in analysis, no discrete theory is available in the literature. This thesis is an attempt in that direction: a discrete pseudoanalytic theory is derived for functions defined on H.
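The lattice H defined above is straightforward to generate numerically. A minimal sketch (the function name `h_lattice` and the parameter choices are illustrative, not from the theses):

```python
def h_lattice(q, x0, y0, m_range, n_range):
    """Points of the discrete plane H = {(q^m * x0, q^n * y0)} for the
    given ranges of the integer exponents m and n."""
    return [(q ** m * x0, q ** n * y0) for m in m_range for n in n_range]

# A 5 x 5 patch of H with q = 1/2 around the base point (1, 1).
pts = h_lattice(0.5, 1.0, 1.0, range(-2, 3), range(-2, 3))
```

Because consecutive coordinates differ by the factor q rather than by a fixed step, the points accumulate geometrically toward the axes, which is what distinguishes H from the uniform Gaussian lattice.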
Abstract:
The term reliability of an equipment or device is often meant to indicate the probability that it carries out the functions expected of it adequately, without failure and within specified performance limits, at a given age, for a desired mission time, when put to use under the designated application and operating environmental stress. The approaches employed in reliability studies can be broadly classified as probabilistic and deterministic: the main interest of the former is to devise tools and methods to identify the random mechanism governing the failure process through a proper statistical framework, while the latter addresses the question of finding the causes of failure and the steps to reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question a reliability analyst has to settle is the form of the life distribution. It is for this reason that a major share of the literature on the mathematical theory of reliability is focussed on methods of arriving at reasonable models of failure times and on showing the failure patterns that induce such models. The application of the methodology of lifetime distributions is not confined to the assessment of the endurance of equipment and systems; it ranges over a wide variety of scientific investigations where the word lifetime may not refer to the length of life in the literal sense, but can be conceived in its most general form as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, the biomedical sciences, economics and extreme value theory.
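As a concrete instance of a life distribution, the Weibull family is a standard choice in the reliability literature (the abstract does not single it out; this is an illustrative sketch of the reliability function R(t) and hazard rate h(t)):

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = P(T > t) = exp(-(t/eta)^beta), shape beta, scale eta."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """h(t) = (beta/eta) * (t/eta)^(beta-1): increasing for beta > 1
    (wear-out), constant for beta = 1 (exponential), decreasing for
    beta < 1 (early failures)."""
    return (beta / eta) * (t / eta) ** (beta - 1)
```

The shape parameter beta thus encodes the "failure pattern" the abstract refers to, which is one reason the choice of the form of the life distribution matters.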
Abstract:
This paper compares the most common digital signal processing methods of exon prediction in eukaryotes, and also proposes a technique for noise suppression in exon prediction. The specimen used here, which has relevance in medical research, has been taken from the public genomic database GenBank. Exon prediction has been done using the digital signal processing methods, viz. the binary method, the EIIP (electron-ion interaction pseudopotential) method and filter methods. Under the filter methods, two filter designs, and two approaches using these designs, have been tried. The discrete wavelet transform has been used for de-noising the exon plots. Results of exon prediction based on the methods mentioned above, which give values closest to the ones found in the NCBI database, are given here, along with the exon plot de-noised using the discrete wavelet transform. The authors' alterations to the proven methods improve the performance of exon prediction algorithms. It has also been shown that the discrete wavelet transform is an effective de-noising tool that can be used with exon prediction algorithms.
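The binary and EIIP mappings mentioned above turn a DNA string into numerical signals that DSP methods can then analyse (e.g. for the period-3 component typical of coding regions). A minimal sketch, using the commonly quoted EIIP values for the four nucleotides (the function names are illustrative, not from the paper):

```python
# Commonly quoted EIIP (electron-ion interaction pseudopotential) values.
EIIP = {'A': 0.1260, 'C': 0.1340, 'G': 0.0806, 'T': 0.1335}

def eiip_signal(seq):
    """Map a DNA string to its EIIP numerical signal."""
    return [EIIP[b] for b in seq.upper()]

def binary_signals(seq):
    """Binary (Voss) indicator sequences: one 0/1 sequence per nucleotide."""
    return {b: [1 if c == b else 0 for c in seq.upper()] for b in 'ACGT'}

sig = eiip_signal("ATGC")
```

The EIIP method yields a single real-valued signal per sequence, whereas the binary method yields four indicator signals whose spectra are usually combined.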
Abstract:
Enhancement of the financial inclusion of rural communities is often recognised as a key strategy for achieving economic development in third world countries. The main objective of this study was to examine the factors that influence consumers' choice of a rural bank in the Gicumbi district of Rwanda. Data were collected using structured questionnaires and analysed using a binary probit regression model and non-parametric procedures. Most consumers were aware of the Popular Bank of Rwanda (BPR) and Umurenge SACCO through radio advertisements, social networks and community meetings. Accessibility, interest rates and quality of services influenced the choice of a given financial intermediary. Moreover, the decision to open a rural bank account was significantly influenced by education and farm size (p < 0.1). Financial managers should take these findings into account when designing marketing campaigns.
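A binary probit model, as used above, assumes P(y_i = 1 | x_i) = Φ(x_i·β), with Φ the standard normal CDF. A minimal sketch of the corresponding log-likelihood (illustrative only; the study's covariates and estimation software are not specified here):

```python
import math

def std_normal_cdf(z):
    """Phi(z), the standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_log_likelihood(beta, X, y):
    """Log-likelihood of y_i ~ Bernoulli(Phi(x_i . beta)) for binary y."""
    ll = 0.0
    for xi, yi in zip(X, y):
        p = std_normal_cdf(sum(b * x for b, x in zip(beta, xi)))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        ll += yi * math.log(p) + (1 - yi) * math.log(1.0 - p)
    return ll
```

Maximising this log-likelihood over β (e.g. by Newton's method) gives the probit estimates; significance at p < 0.1, as reported for education and farm size, would then come from the usual asymptotic standard errors.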
Abstract:
When triangulating a belief network we aim to obtain a junction tree of minimum state space. The search for the optimal triangulation can be cast as a search over all permutations of the network's variables. Our approach is to embed the discrete set of permutations in a convex continuous domain D. By suitably extending the cost function over D and solving the continuous nonlinear optimization task, we hope to obtain a good triangulation with respect to the aforementioned cost. In this paper we introduce an upper bound on the total junction tree weight as the cost function. The appropriateness of this choice is discussed and explored by simulations. We then present two ways of embedding the new objective function into continuous domains and show that they perform well compared to the best known heuristic.
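The discrete search space can be made concrete as follows: each elimination order induces a triangulation whose total weight is the sum of the state-space sizes of the cliques formed during elimination. A simplified sketch (function names are ours; the brute-force search over permutations is feasible only for tiny networks, which is exactly why the paper relaxes it to a continuous domain):

```python
from itertools import permutations

def triangulation_weight(adj, card, order):
    """Total clique weight induced by eliminating variables in `order`.

    adj: dict var -> set of neighbouring vars; card: dict var -> state count.
    """
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    total = 0
    for v in order:
        nbrs = adj.pop(v)
        w = card[v]
        for u in nbrs:
            w *= card[u]
        total += w                    # clique {v} + neighbours
        for u in nbrs:                # connect neighbours (fill-in edges)
            adj[u] |= nbrs - {u}
            adj[u].discard(v)
    return total

def best_order(adj, card):
    """Brute-force minimum over all permutations (tiny networks only)."""
    return min(permutations(adj), key=lambda o: triangulation_weight(adj, card, o))

# Toy chain a - b - c with binary variables.
adj = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
card = {'a': 2, 'b': 2, 'c': 2}
```

On the chain, eliminating the middle variable first forces a fill-in edge and a larger clique, which is what the weight penalises.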
Abstract:
This paper examines a dataset which is modeled well by the Poisson-log normal process, and by this process mixed with log normal data, both of which are turned into compositions. This generates compositional data that have zeros without any need for conditional models or the assumption that there are missing or censored data needing adjustment. It also enables us to model dependence on covariates and within the composition.
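A minimal sketch of how a Poisson-log normal process yields compositions with genuine zeros (function names are illustrative; the paper's actual model details are not reproduced here): each count is Poisson with a log-normally distributed rate, and the count vector is then closed to sum to one, so zero counts remain exact zeros in the composition.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def poisson_lognormal_composition(mu, sigma, n_parts, rng):
    """Draw each count with a log-normal Poisson rate, then close to a
    composition. Zero counts stay zero -- no conditioning or imputation."""
    counts = [poisson(rng.lognormvariate(mu, sigma), rng) for _ in range(n_parts)]
    total = sum(counts)
    return [c / total for c in counts] if total else [0.0] * n_parts

rng = random.Random(7)
comp = poisson_lognormal_composition(0.0, 1.0, 5, rng)
```

Covariates could enter through mu, which is how dependence on explanatory variables would be modelled in this setting.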
Abstract:
A joint distribution of two discrete random variables with finite support can be displayed as a two-way table of probabilities adding to one. Assume that this table has n rows and m columns and that all probabilities are non-null. Such a table can be seen as an element of the simplex of n · m parts. In this context, the marginals are identified as compositional amalgams and the conditionals (rows or columns) as subcompositions; simplicial perturbation appears as Bayes' theorem. Moreover, the Euclidean elements of the Aitchison geometry of the simplex can also be translated into the table of probabilities: subspaces, orthogonal projections, distances. Two important questions are addressed: (a) given a table of probabilities, which is the nearest independent table to the initial one? (b) which is the largest orthogonal projection of a row onto a column, or, equivalently, which is the information in a row explained by a column, thus explaining the interaction? To answer these questions three orthogonal decompositions are presented: (1) by columns and a row-wise geometric marginal; (2) by rows and a column-wise geometric marginal; (3) by independent two-way tables and fully dependent tables representing row-column interaction. An important result is that the nearest independent table is the product of the two (row- and column-wise) geometric marginal tables. A corollary is that, in an independent table, the geometric marginals conform with the traditional (arithmetic) marginals. These decompositions can be compared with standard log-linear models. Key words: balance, compositional data, simplex, Aitchison geometry, composition, orthonormal basis, arithmetic and geometric marginals, amalgam, dependence measure, contingency table
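The stated result, that the nearest independent table is the product of the row- and column-wise geometric marginals, can be illustrated directly (a sketch with illustrative function names, not the authors' code):

```python
import math

def closure(xs):
    """Normalise a positive vector to sum to one."""
    s = sum(xs)
    return [x / s for x in xs]

def geometric_row_marginal(P):
    """Geometric mean of each row, closed to a composition."""
    m = len(P[0])
    return closure([math.prod(row) ** (1.0 / m) for row in P])

def geometric_col_marginal(P):
    n = len(P)
    return closure([math.prod(col) ** (1.0 / n) for col in zip(*P)])

def nearest_independent_table(P):
    """Outer product of the geometric marginals, closed to sum to one."""
    r, c = geometric_row_marginal(P), geometric_col_marginal(P)
    tot = sum(ri * cj for ri in r for cj in c)
    return [[ri * cj / tot for cj in c] for ri in r]

# An already-independent table: the outer product of [0.6, 0.4] and [0.7, 0.3].
P = [[0.42, 0.18], [0.28, 0.12]]
```

For this independent P, the nearest independent table recovers P itself, and the geometric row marginal coincides with the arithmetic one ([0.6, 0.4]), illustrating the corollary in the abstract.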
Abstract:
In most psychological tests and questionnaires, a test score is obtained by taking the sum of the item scores. In virtually all cases where the test or questionnaire contains multidimensional forced-choice items, this traditional scoring method is also applied. We argue that the summation of scores obtained with multidimensional forced-choice items produces uninterpretable test scores. We therefore propose three alternative scoring methods: a weak and a strict rank-preserving scoring method, which both allow an ordinal interpretation of test scores; and a ratio-preserving scoring method, which allows a proportional interpretation of test scores. Each proposed scoring method yields an index for each respondent indicating the degree to which the response pattern is inconsistent. Analysis of real data showed that, with respect to rank preservation, the weak and strict rank-preserving methods resulted in lower inconsistency indices than the traditional scoring method; with respect to ratio preservation, the ratio-preserving scoring method resulted in lower inconsistency indices than the traditional scoring method.
Abstract:
Abstract taken from the publication
Abstract:
Exercises and solutions for a third- or fourth-year maths course. Diagrams for the questions are collected in the support.zip file, as .eps files.
Abstract:
This document presents a review of the main theoretical approaches to human resources in science and technology and the empirical modelling of academic and scientific careers using Curriculum Vitae (CVs) as the main source of information. Additionally, it shows the results of several studies carried out in Colombia based on knowledge-capital theory. These studies have made it possible to establish a line of research on the evaluation of the behaviour of human resources, the transition towards scientific communities and the study of researchers' academic careers. They also show that the information contained in the ScienTI Platform (Grup-Lac and Cv-Lac) makes it possible to establish concretely the country's scientific and technological capabilities. Key words: human resources, academic and scientific careers, discrete regression and qualitative choice models. JEL classification: C25, O15.
Abstract:
The present study aimed to assess the tolerance and efficacy of rituximab (RTX), a chimeric IgG1 monoclonal antibody directed against the CD20 receptor present on B lymphocytes, in patients with autoimmune rheumatic diseases (AIRD). For this purpose, patients treated with RTX and their respective clinical charts were comprehensively examined. Indications for treatment were a refractory character of the disease, or inefficacy of or intolerance to other immunosuppressors. Activity indexes (SLEDAI, DAS28, and specific clinical manifestations) were used to evaluate efficacy. Serious side effects were also recorded. Seventy-four patients were included. Forty-three patients had systemic lupus erythematosus (SLE), 21 had rheumatoid arthritis (RA), 8 had Sjögren's syndrome (SS), and 2 had Takayasu's arteritis (TA). RTX was well tolerated in 66 (89%) patients. In 8 patients (SLE = 3, SS = 3, RA = 2), serious side effects led to discontinuation. The mean follow-up period was 12 ± 7.8 (2–35) months. Efficacy of RTX was registered in 58/66 (87%) patients, of whom 36 (83%) had SLE, 18/21 (85%) had RA, 3/8 (37%) had SS, and 1 had TA. The mean time to efficacy was 6.3 ± 5.1 weeks. A significant steroid-sparing effect was noticed in half of the patients. These results add further evidence for the use of RTX in AIRD. Based on its risk–benefit ratio, RTX might be used as the first-choice treatment for patients with severe AIRD.
Abstract:
Transnational processes have marked a change in the relations among the actors of the international system, allowing work on diverse causes across borders. Social movements have taken advantage of this so that their struggle is not confined to a single country but, on the basis of similar objectives, problems, values and actions, is reflected in different states, producing common, collective action to bring about change. This phenomenon is examined with reference to the pro-choice movement's transnational articulation in Colombia for the promotion of sexual and reproductive rights in the period 2001 to 2011, during which it achieved a series of important objectives that have brought about legal changes within the country, as well as change within Colombian society. The study, analysis and understanding of the articulation of the pro-choice movement through a transnational dynamic for the promotion of sexual and reproductive rights in Colombia stands out as an important topic because of its current relevance in the world, having remained latent over the last twenty years. Likewise, the identification of the action of transnational social movements, like that of other international actors, in the transformation of both local and international societies can be explained within the field of International Relations.