987 results for General Algorithm


Relevance: 20.00%

Abstract:

Report on a review of selected general and application controls over the Iowa Public Employees' Retirement System (IPERS) Legacy and I-Que Pension Administration Systems for the period May 16, 2011 through June 16, 2011.

Relevance: 20.00%

Abstract:

Report on a review of selected general and application controls over the Iowa Department of Administrative Services' (DAS) Human Resource Information System (HRIS), Payroll, Integrated Information for Iowa (I/3) and E-Payment Engine Systems for the periods April 13, 2009 through May 15, 2009 and April 5, 2010 through May 7, 2010.

Relevance: 20.00%

Abstract:

From our reading of the literature for the year 2010, we have singled out eight items that seem to us significant for the practice of medicine. Small doses of colchicine are useful in the treatment of gout. No efficacious treatment for muscular cramps can be recommended. A cervical collar can usefully be prescribed for the treatment of cervical radiculopathy. A single dose of azithromycin can be considered as a third-line treatment for syphilis. High doses of vitamin D should not be prescribed for the prevention of fractures in elderly women because of the risk of falls; the wearing of bifocals can also be associated with this risk. A clinical score is available to help with the diagnosis of thoracic pain. NT-proBNP is of limited use for the follow-up of patients suffering from heart failure.

Relevance: 20.00%

Abstract:

This brochure is the printed copy of the speech made by Hon. John A. Kasson to the Twentieth General Assembly for the inauguration of the Iowa State Capitol.

Relevance: 20.00%

Abstract:

An extension of the self-consistent field approach formulated by Cohen in the preceding paper is proposed in order to include the most general kind of two-body interactions, i.e., interactions depending on position, momenta, spin, isotopic spin, etc. The dielectric function is replaced by a dielectric matrix. The evaluation of the energies involves the computation of a matrix inversion and a trace.
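
A schematic numerical sketch of this structure, using toy matrices of my own choosing rather than the paper's actual interaction kernels: a dielectric matrix is built from a bare interaction and a polarizability, and an energy-like screening correction is then evaluated through a matrix inversion and a trace, as the abstract describes.

```python
# Toy sketch: promote the scalar dielectric function to a dielectric matrix,
# then evaluate an energy-like quantity via matrix inversion and a trace.
# All matrices below are illustrative stand-ins, not the paper's kernels.
import numpy as np

rng = np.random.default_rng(0)
n = 6                                    # size of the (toy) basis

V = rng.normal(size=(n, n)); V = (V + V.T) / 2                        # symmetric bare interaction
chi0 = -np.abs(rng.normal(size=(n, n))); chi0 = (chi0 + chi0.T) / 2   # toy polarizability

eps = np.eye(n) - V @ chi0               # dielectric matrix (replaces eps(q))
W = np.linalg.inv(eps) @ V               # screened interaction via matrix inversion

# An energy-like screening correction evaluated as a trace.
E = 0.5 * np.trace((W - V) @ chi0)
print(f"screening correction (toy units): {E:.4f}")
```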

Relevance: 20.00%

Abstract:

We discuss reality conditions and the relation between spacetime diffeomorphisms and gauge transformations in Ashtekar's complex formulation of general relativity. We produce a general theoretical framework for the stabilization algorithm for the reality conditions, which is different from Dirac's method of stabilization of constraints. We solve the problem of the projectability of the diffeomorphism transformations from configuration-velocity space to phase space, linking them to the reality conditions. We construct the complete set of canonical generators of the gauge group in the phase space which includes all the gauge variables. This result proves that the canonical formalism has all the gauge structure of the Lagrangian theory, including the time diffeomorphisms.
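
For contrast, Dirac's method of stabilization of constraints, from which the paper's framework departs, can be sketched on a toy system (my own example, not from the paper): demanding that a primary constraint be preserved in time generates a secondary constraint, whose own preservation then fixes the Lagrange multiplier.

```python
# A toy illustration (my own, not from the paper) of Dirac's stabilization of
# constraints, the standard procedure the paper's framework is contrasted with.
import sympy as sp

x, z, px, pz, lam = sp.symbols('x z p_x p_z lambda')
coords, momenta = [x, z], [px, pz]

def poisson_bracket(f, g):
    """Canonical Poisson bracket {f, g} on the (x, z) phase space."""
    return sum(sp.diff(f, q) * sp.diff(g, p) - sp.diff(f, p) * sp.diff(g, q)
               for q, p in zip(coords, momenta))

H = px**2 / 2 + z**2      # toy canonical Hamiltonian
phi = pz                  # primary constraint phi ~ 0
H_total = H + lam * phi   # total Hamiltonian with multiplier lambda

# Step 1: demand that phi is preserved in time, {phi, H_T} ~ 0.
secondary = sp.simplify(poisson_bracket(phi, H_total))
print(secondary)          # -> -2*z : a secondary constraint z ~ 0

# Step 2: stabilize the secondary constraint; this fixes the multiplier.
fix = sp.simplify(poisson_bracket(z, H_total))
print(sp.solve(sp.Eq(fix, 0), lam))   # -> [0] : lambda is determined
```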

Relevance: 20.00%

Abstract:

Diffeomorphism-induced symmetry transformations and time evolution are distinct operations in generally covariant theories formulated in phase space. Time is not frozen. Diffeomorphism invariants are consequently not necessarily constants of the motion. Time-dependent invariants arise through the choice of an intrinsic time, or equivalently through the imposition of time-dependent gauge-fixing conditions. One example of such a time-dependent gauge fixing is the Komar-Bergmann use of Weyl curvature scalars in general relativity. An analogous gauge fixing is also imposed for the relativistic free particle, and the resulting complete set of time-dependent invariants for this exactly solvable model is displayed. In contrast with the free-particle case, we show that gauge invariants that are simultaneously constants of motion cannot exist in general relativity; they vary with intrinsic time.
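
The free-particle statement can be made concrete with a short worked sketch; this is standard material written in my own notation, not quoted from the paper.

```latex
% Relativistic free particle: mass-shell constraint and gauge transformations.
\[
  \phi = p^{\mu}p_{\mu} - m^{2} \approx 0,
  \qquad
  \delta_{\epsilon} x^{\mu} = \epsilon\,\{x^{\mu},\phi\} = 2\epsilon\, p^{\mu},
  \qquad
  \delta_{\epsilon} p^{\mu} = 0 .
\]
% Choosing the intrinsic time T = x^0 yields the time-dependent invariants
\[
  X^{i}(T) = x^{i} + \frac{p^{i}}{p^{0}}\,\bigl(T - x^{0}\bigr),
  \qquad
  \delta_{\epsilon} X^{i}
    = 2\epsilon\, p^{i} - \frac{p^{i}}{p^{0}}\,2\epsilon\, p^{0} = 0,
  \qquad
  \frac{dX^{i}}{dT} = \frac{p^{i}}{p^{0}} .
\]
% Gauge invariant at each fixed T, yet explicitly time dependent: an
% invariant that is not a constant of the motion.
```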

Relevance: 20.00%

Abstract:

We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We argue that in quantum algorithms the arrow of time is a majorization arrow.
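
The step-by-step majorization in Grover's algorithm can be checked numerically. The following is a minimal sketch of my own (not the paper's code): it simulates Grover iterations on three qubits and verifies that the sorted probability vector after each step majorizes the one before it.

```python
# Numerical check of the majorization arrow in Grover's algorithm.
import numpy as np

n_qubits = 3
N = 2 ** n_qubits
target = 5  # assumed marked item, chosen arbitrarily

state = np.full(N, 1 / np.sqrt(N))       # uniform initial state

def grover_step(psi):
    """One Grover iteration: oracle phase flip + inversion about the mean."""
    psi = psi.copy()
    psi[target] *= -1                    # oracle
    return 2 * psi.mean() - psi          # diffusion operator 2|s><s| - I

def majorizes(p, q):
    """True if p majorizes q: descending partial sums of p dominate q's."""
    ps = np.cumsum(np.sort(p)[::-1])
    qs = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(ps >= qs - 1e-12))

probs = np.abs(state) ** 2
for step in range(1, 3):                 # ~(pi/4)*sqrt(8) ~ 2 iterations
    state = grover_step(state)
    new_probs = np.abs(state) ** 2
    print(f"step {step}: majorizes previous -> {majorizes(new_probs, probs)}")
    probs = new_probs
```

Both steps print `True`: each iteration concentrates probability on the target, so the probability distribution becomes steadily more ordered in the majorization sense.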

Relevance: 20.00%

Abstract:

Iowa Code section 8D.10 requires certain state agencies to prepare an annual report to the General Assembly certifying the identified savings associated with that state agency's use of the Iowa Communications Network (ICN). This report covers estimated cost savings related to video conferencing via ICN for the Iowa Department of Transportation (Iowa DOT). In fiscal year 2011, the Iowa DOT did not conduct any sessions utilizing ICN's video conferencing system; therefore, no cost savings were calculated for this report.

Relevance: 20.00%

Abstract:

This book is a reprint of the handwritten notes of Robert Lucas, first Governor of Iowa Territory from 1838 to 1841, written while he was in camp and on the march during the War of 1812, from April to September 1812.

Relevance: 20.00%

Abstract:

Executive Summary

The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation undertaken to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that the realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the proposed aggregation of performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were led to conclude that the proposed algorithm leads to a distribution of portfolio returns that second-order stochastically dominates those obtained from virtually all individual performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market: the bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
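
Both dominance checks from Chapter 2 lend themselves to a short computational sketch. The sample arrays `agg` and `single` below are hypothetical stand-ins for realized returns from the aggregated strategy and a single-measure strategy; this is an illustration of the tests described above, not the thesis's code.

```python
# Sketch of the Kolmogorov-Smirnov test and the first- and second-order
# stochastic dominance checks, on hypothetical return samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
agg = rng.normal(0.08, 0.10, 1000)     # assumed returns, aggregated measures
single = rng.normal(0.05, 0.10, 1000)  # assumed returns, single measure

# Two-sample Kolmogorov-Smirnov test: are the distributions different at all?
ks_stat, p_value = stats.ks_2samp(agg, single)
print(f"KS statistic {ks_stat:.3f}, p-value {p_value:.4f}")

# First-order dominance: F_agg(x) <= F_single(x) for all x on a grid.
grid = np.linspace(min(agg.min(), single.min()),
                   max(agg.max(), single.max()), 200)
cdf = lambda s, x: np.searchsorted(np.sort(s), x, side="right") / len(s)
print("first-order dominance:", bool(np.all(cdf(agg, grid) <= cdf(single, grid) + 1e-9)))

# Second-order check via the absolute Lorenz curve: cumulative sums of sorted
# returns, equivalently expected shortfalls across the range of quantiles.
lorenz = lambda s: np.cumsum(np.sort(s)) / len(s)
print("second-order dominance:", bool(np.all(lorenz(agg) >= lorenz(single))))
```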

Relevance: 20.00%

Abstract:

The project "Quantification and qualification of ambulatory health care", financed by the Swiss National Science Foundation and covering the Cantons of Vaud and Fribourg, has two main goals:
--a structural study of the elements of the ambulatory care sector, carried out through inventories of the professions concerned (physicians, public health nurses, physiotherapists, pharmacists, medical laboratories), making it possible to better characterize the "supply". This inventory work includes the collection and analysis of existing statistical data as well as surveys of the different professions, by questionnaires sent out from September 1980 and by interviews.
--a functional study, inspired by the US National Ambulatory Medical Care Survey and by similar studies elsewhere, investigating the modes of practice of the various providers, with particular regard to interprofessional collaboration (studied through referrals from one profession to another).
The first months of the project were devoted to methodological research in this regard, centered on the use of systems analysis, and to the development of appropriate instruments.

Relevance: 20.00%

Abstract:

With the advancement of high-throughput sequencing and the dramatic increase in available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or diverse regions in a genome to the phylogenetic inference of species' evolutionary history. Among the different types of genome sequences, protein-coding regions are particularly interesting because of their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. Current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models and hybrid ones. The mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplifying assumptions that are required to overcome the burden of computational complexity. The main assumptions applied to current mechanistic codon models are that (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level.
In this thesis, I pursue two main objectives. The first is to develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the assumptions above. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one, which holds all of them, to the most general one, which relaxes all of them. The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the model best aligned with the underlying characteristics of each data set. Our experiments show that holding all three assumptions is not realistic for any of the real data sets: using simple models that hold these assumptions can be misleading and can result in inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes all three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, on randomly chosen data sets, the proposed generalized mechanistic codon model outperforms the other codon models with respect to the AICc metric in about half of the data sets. Furthermore, I show through several experiments that the proposed general model is biologically plausible.
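
As a minimal sketch of the Kronecker construction mentioned above (my own illustration, not the thesis's implementation): three position-specific 4x4 GTR nucleotide matrices, which relax assumption (b), are combined into a 64x64 codon rate-matrix skeleton. The Kronecker sum shown here still permits only single-nucleotide substitutions; the generalized model described above relaxes that restriction (assumption (a)) as well.

```python
# Building a 64x64 single-substitution codon rate-matrix skeleton from
# per-position 4x4 nucleotide models with Kronecker operations.
import numpy as np

def gtr_matrix(rates, freqs):
    """A 4x4 GTR nucleotide rate matrix from 6 exchangeabilities and 4 freqs."""
    a, b, c, d, e, f = rates
    R = np.array([[0, a, b, c],
                  [a, 0, d, e],
                  [b, d, 0, f],
                  [c, e, f, 0]], dtype=float)
    Q = R * freqs                          # scale columns by target frequencies
    np.fill_diagonal(Q, -Q.sum(axis=1))    # rows of a rate matrix sum to zero
    return Q

I4 = np.eye(4)
freqs = np.array([0.3, 0.2, 0.2, 0.3])
Q1 = gtr_matrix([1, 2, 1, 1, 2, 1], freqs)   # position-specific models with
Q2 = gtr_matrix([1, 4, 1, 1, 4, 1], freqs)   # illustrative parameter values
Q3 = gtr_matrix([1, 8, 1, 1, 8, 1], freqs)

# Kronecker sum: substitutions act on one codon position at a time.
R_codon = (np.kron(np.kron(Q1, I4), I4)
           + np.kron(np.kron(I4, Q2), I4)
           + np.kron(np.kron(I4, I4), Q3))
print(R_codon.shape)                          # (64, 64)
print(np.allclose(R_codon.sum(axis=1), 0))    # still a valid rate matrix
```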

Relevance: 20.00%

Abstract:

The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool for the problem of spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
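
A GRNN is, in essence, Gaussian-kernel (Nadaraya-Watson) regression whose kernel width is tuned automatically. The sketch below, on synthetic data rather than the paper's contamination surveys, tunes an isotropic width sigma by leave-one-out cross-validation:

```python
# Minimal GRNN sketch: Gaussian-kernel weighted regression with the isotropic
# kernel width chosen by leave-one-out cross-validation on synthetic data.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN prediction: kernel-weighted average of the training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

def loo_error(X, y, sigma):
    """Leave-one-out MSE: zero out each point's own kernel weight."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(w, 0.0)
    pred = (w @ y) / w.sum(axis=1)
    return np.mean((pred - y) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, (200, 2))                   # 2-D "spatial" coordinates
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # noisy field to map

sigmas = np.logspace(-1, 1, 20)                    # candidate kernel widths
best = min(sigmas, key=lambda s: loo_error(X, y, s))
print(f"cross-validated sigma: {best:.3f}")
print(grnn_predict(X, y, np.array([[5.0, 5.0]]), best))
```

An anisotropic variant, as the abstract mentions, would replace the single sigma with one width per coordinate; the PNN classifier follows the same kernel logic with class-wise vote sums instead of a weighted average.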