989 results for Assignment Problems


Relevance: 20.00%

Publisher:

Abstract:

Use of the consignable margin is an indicator of public servants' indebtedness, since it commits part of their future income and reduces their purchasing power. From 2003 onward the Brazilian government stimulated the supply of payroll-deductible (consignado) credit at more attractive interest rates, leading to a 1,340.20% increase in the balance of contracted operations between 2004 and 2011. Owing to the characteristics of their employment relationship, public servants became the main takers of payroll loans. However, a poor understanding of the consignable margin and a lack of financial education can push servants into financial difficulty, notably over-indebtedness, that is, the inability to pay their credit debts, which may affect their performance at work. To identify how the institution acts on the implications of public servants' use of the consignable margin, a survey with a quantitative-qualitative approach was carried out by means of a questionnaire, obtaining 210 valid responses. The data were tabulated and analyzed using statistical techniques such as descriptive analysis and contingency tables, observing the values of chi-square tests and Cramér's V. The conclusions indicate that financial guidance can help servants use the consignable margin more consciously, avoiding financial problems and improving their performance at work, and that the institution studied could act more actively on the implications of the consignable margin. Finally, as an intervention plan, it was recommended that the institution develop courses on financial education and on the consignable margin, presenting concepts and everyday cases for better assimilation by the servants, delivered through the institution's existing staff training program.
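
The contingency-table statistics mentioned in the abstract can be computed directly; a minimal sketch (the table counts below are invented for illustration):

```python
import numpy as np

def chi_square_and_cramers_v(table):
    """Pearson chi-square statistic and Cramér's V for a contingency table."""
    observed = np.asarray(table, dtype=float)
    n = observed.sum()
    # Expected counts under independence of rows and columns.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / n
    chi2 = ((observed - expected) ** 2 / expected).sum()
    # Cramér's V rescales chi-square to the [0, 1] range.
    k = min(observed.shape) - 1
    v = np.sqrt(chi2 / (n * k))
    return chi2, v

# Hypothetical 2x2 table: financial guidance received vs. over-indebtedness.
chi2, v = chi_square_and_cramers_v([[10, 20], [20, 10]])
print(round(chi2, 3), round(v, 3))  # → 6.667 0.333
```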

Relevance: 20.00%

Publisher:

Abstract:

The radial undistortion model proposed by Fitzgibbon and the radial fundamental matrix were early steps toward extending classical epipolar geometry to distorted cameras. Minimal solvers were later proposed to find relative pose and radial distortion given point correspondences between images. A major drawback of all these approaches, however, is that they require the distortion center to be known exactly. In this paper we show how the distortion center can be absorbed into a new radial fundamental matrix. This new formulation is much more practical, as it also accommodates digital zoom, cropped images, and camera-lens systems where the distortion center does not coincide exactly with the image center. In particular, we start from the setting where only one of the two images contains radial distortion, analyze the structure of the resulting radial fundamental matrix, and show that the technique generalizes to other linear multi-view relationships such as the trifocal tensor and the homography. For the new radial fundamental matrix we propose different estimation algorithms from 9, 10, and 11 points. We show how to extract the epipoles and demonstrate practical applicability on several strongly distorted epipolar geometry image pairs that - to the best of our knowledge - no other existing algorithm can handle properly.
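
The Fitzgibbon division model these solvers build on maps a distorted point to its undistorted position via p_u = p_d / (1 + λ r²), with p_d measured relative to the distortion center. A minimal sketch (the distortion parameter and center values are illustrative, not from the paper):

```python
import numpy as np

def undistort(points, lam, center=(0.0, 0.0)):
    """Fitzgibbon's one-parameter division model: p_u = p_d / (1 + lam * r^2),
    where p_d is the point expressed relative to the distortion center."""
    p = np.asarray(points, dtype=float) - center
    r2 = (p ** 2).sum(axis=1, keepdims=True)  # squared radius per point
    return p / (1.0 + lam * r2) + center

# A point at the distortion center is unchanged; points further out move more.
pts = np.array([[100.0, 100.0], [300.0, 100.0]])
print(undistort(pts, lam=-1e-7, center=(100.0, 100.0)))
```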

Relevance: 20.00%

Publisher:

Abstract:

1 – Summary of the decision taken by the Portuguese Constitutional Court on January 13, 2011; 2 – Complete text of the decision of the Portuguese Constitutional Court of January 13, 2011, Judge Maria João ANTUNES (Reporter), Judge Carlos Pamplona de OLIVEIRA, Judge José Borges SOEIRO, Judge Gil GALVÃO, Judge Rui Manuel Moura RAMOS (President) – as published at tribunalconstitucional.pt, August 1, 2011; 3 – Brief annotation on the problem of the "medical act"; 3.1 – Some further conclusions on the brief annotation on the problem of the "medical act"; 3.2 – Brief annotation on the problem of "consent" – continuation of the previous comments; 4 – Conclusions. It must never be forgotten that "consent" is not the only cause of exclusion of unlawfulness.

Relevance: 20.00%

Publisher:

Abstract:

This article was written by a Swiss-German historical demographer after visiting several Brazilian universities in 1984 as a guest professor. It aims at promoting a real dialog between developed and developing countries, opening the discussion with the question: can we learn from each other? An affirmative answer is given, but not in the superficial manner in which the discussion partners simply give each other some "good advice", or in which one declares his own country's development to be the only valid standard. Three points are emphasized: 1. Using infant mortality in S. Paulo from 1908 to 1983 as an example, it is shown that Brazil has at its disposal excellent, highly varied research literature that is, for the most part, unjustifiably unknown to us in Europe. Brazil by no means needs our tutoring as regards the causal relationships; rather, we could learn two things from Brazil about this. For one, it becomes clear that our almost exclusively medical-biological view is inappropriate for judging present-day problems in Brazil, so that conclusions derived from it are transferable only to a limited extent. For another, we need to reinterpret the history of infant mortality in our own countries up to the past few decades in a much more encompassing "Brazilian" sense. 2. A fruitful dialog can only take place if both partners frankly present their problems. For this reason, the article refers with much emphasis to our present problems in dealing with death and dying - problems arising near the end of the demographic and epidemiologic transitions: the superannuation of the population, chronic incurable illnesses as the main causes of death, and the manifold dependencies of more and more elderly and very old people at the end of a long life. Brazil seems to be catching up with us in this respect and will be confronted with these problems sooner or later. A far-sighted discussion already at this time thus seems useful.
3. The article does not want to conclude, however, with the rather depressing state of affairs of problems alternately superseding one another. Despite the caution that is definitely warranted when prognoses are made by extrapolating from historical findings, the foreseeable development, especially of the epidemiologic transition toward a rectangular survival curve, nevertheless provides good reason for being rather optimistic about the future: first with regard to the development in our own countries, and then - assuming the present similar tendencies of development persist - also with regard to Brazil.

Relevance: 20.00%

Publisher:

Abstract:

Opposite enantiomers exhibit different NMR properties in the presence of a common external chiral element, and a chiral molecule exhibits different NMR properties in the presence of external enantiomeric chiral elements. Automatic prediction of such differences, and comparison with experimental values, leads to the assignment of the absolute configuration. Here two cases are reported: one using a dataset of 80 chiral secondary alcohols esterified with (R)-MTPA and the corresponding 1H NMR chemical shifts, and the other using 94 13C NMR chemical shifts of chiral secondary alcohols in two enantiomeric chiral solvents. For the first application, counterpropagation neural networks were trained to predict the sign of the difference between the chemical shifts of opposite stereoisomers. The networks were trained to take the chirality code of the alcohol as input and to give the NMR property as output. In the second application similar networks were employed, but the property to predict was the difference of chemical shifts in the two enantiomeric solvents. For independent test sets of 20 objects, 100% correct predictions of the sign of the chemical shift differences were obtained in both applications. Additionally, with the second dataset, the difference of chemical shifts in the two enantiomeric solvents was quantitatively predicted, yielding r² = 0.936 between predicted and experimental values for the test set.
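
At prediction time, a trained counterpropagation network essentially returns the output stored at the winning (nearest) unit of its Kohonen layer. A much-simplified nearest-prototype stand-in for that lookup step (the "chirality codes" and signs below are made-up toy data, not the paper's descriptors):

```python
import numpy as np

def predict_sign(code, prototype_codes, prototype_signs):
    """Return the sign stored at the nearest prototype (1-NN lookup),
    a simplified stand-in for the winner-take-all Kohonen layer."""
    codes = np.asarray(prototype_codes, dtype=float)
    d = np.linalg.norm(codes - np.asarray(code, dtype=float), axis=1)
    return prototype_signs[int(np.argmin(d))]

# Toy "chirality code" vectors paired with the sign of the shift difference.
protos = [[1.0, 0.0], [0.0, 1.0]]
signs = [+1, -1]
print(predict_sign([0.9, 0.2], protos, signs))  # → 1
```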

Relevance: 20.00%

Publisher:

Abstract:

INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important for achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer programs that automatically identify the underlying cause of death. OBJECTIVE: This work compares the underlying causes of death processed by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation used the input data file for the ACME system covering deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 records of the corresponding death certificates. The differences between the underlying causes selected by the ACME and SCB systems in the month of June, when regarded as SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: Processing the underlying causes of death with the ACME and SCB systems yielded 3,278 differences, which were analysed and ascribed to unanswered dialogue boxes during processing, to deaths due to human immunodeficiency virus [HIV] disease for which neither system made specific provision, to coding and/or keying errors, and to actual problems. Detailed analysis of the latter disclosed that most of the underlying causes processed by the SCB system were correct, that the two systems interpreted some mortality coding rules differently, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were genuine SCB errors.
CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, support the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.

Relevance: 20.00%

Publisher:

Abstract:

Thirty years ago, G.N. de Oliveira proposed the following completion problems: describe the possible characteristic polynomials of the 2x2 block matrix [C_ij], i, j in {1, 2}, where C_11 and C_22 are square submatrices, when some of the blocks C_ij are fixed and the others vary. Several of these problems remain unsolved. This paper gives the solution, over the field of real numbers, of Oliveira's problem in which the diagonal blocks C_11 and C_22 are fixed and the others vary.
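
The objects in Oliveira's problem can be explored numerically: assemble a matrix from its four blocks and read off its characteristic polynomial. A sketch with small illustrative blocks (the values are arbitrary):

```python
import numpy as np

def char_poly_of_blocks(c11, c12, c21, c22):
    """Characteristic polynomial coefficients (leading coefficient 1)
    of the assembled block matrix [[C11, C12], [C21, C22]]."""
    top = np.hstack([c11, c12])
    bottom = np.hstack([c21, c22])
    return np.poly(np.vstack([top, bottom]))

# Fix the diagonal blocks and vary the off-diagonal ones, as in the paper's setting.
c11 = np.array([[2.0]])
c22 = np.array([[3.0]])
coeffs = char_poly_of_blocks(c11, np.array([[1.0]]), np.array([[1.0]]), c22)
print(coeffs)  # coefficients of lambda^2 - 5*lambda + 5
```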

Relevance: 20.00%

Publisher:

Abstract:

In this paper we make an exhaustive study of the fourth-order linear operator u^(4) + M u coupled with the clamped beam conditions u(0) = u(1) = u'(0) = u'(1) = 0. We obtain the exact values of the real parameter M for which this operator satisfies an anti-maximum principle. Such a property is equivalent to the related Green's function being nonnegative in [0, 1] x [0, 1]. When M < 0 we obtain the best estimate by means of spectral theory, and for M > 0 we attain the optimal value by studying the oscillation properties of the solutions of the homogeneous equation u^(4) + M u = 0. By using the method of lower and upper solutions we deduce the existence of solutions for nonlinear problems coupled with these boundary conditions. (C) 2011 Elsevier Ltd. All rights reserved.
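
The clamped beam operator can be probed numerically with finite differences: the standard five-point stencil for u'''' plus ghost points enforcing u'(0) = u'(1) = 0. For M = 0 a positive uniform load should produce a positive deflection everywhere, consistent with a nonnegative Green's function. A sketch (grid size and load are arbitrary choices):

```python
import numpy as np

def clamped_beam_solve(m, f, n=200):
    """Solve u'''' + M u = f on (0, 1) with u(0)=u(1)=u'(0)=u'(1)=0, using
    the stencil u'''' ~ (u[i-2] - 4u[i-1] + 6u[i] - 4u[i+1] + u[i+2]) / h^4.
    Ghost points from u'(0)=0 and u'(1)=0 turn the corner entries 6 into 7."""
    h = 1.0 / n
    k = n - 1  # interior unknowns u_1 .. u_{n-1}
    a = np.zeros((k, k))
    for i in range(k):
        for j, c in zip(range(i - 2, i + 3), (1.0, -4.0, 6.0, -4.0, 1.0)):
            if 0 <= j < k:
                a[i, j] += c
    a[0, 0] += 1.0    # ghost point u_{-1} = u_1
    a[-1, -1] += 1.0  # ghost point u_{n+1} = u_{n-1}
    a = a / h ** 4 + m * np.eye(k)
    return np.linalg.solve(a, np.full(k, float(f)))

u = clamped_beam_solve(m=0.0, f=1.0)
# For M = 0 and f = 1 the exact solution is x^2 (1-x)^2 / 24, with maximum 1/384.
print(u.min() > 0.0, u.max())
```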

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: To determine the prevalence and severity of occlusal problems in populations at the ages of deciduous and permanent dentition, and to carry out a meta-analysis estimating the weighted odds ratio for occlusal problems between the two groups. METHODS: Data from a probabilistic sample (n=985) of schoolchildren aged 5 and 12, drawn from an epidemiological study in the municipality of São Paulo, Brazil, were analyzed using univariate logistic regression (MLR). Results of cross-sectional studies published in the last 70 years were examined in the meta-analysis. RESULTS: The prevalence of occlusal problems increased from 49.0% (95% CI 47.4%-50.6%) in the deciduous dentition to 71.3% (95% CI 70.3%-72.3%) in the permanent dentition (p<0.001). Dentition was the only variable significantly associated with the severity of malocclusion (OR=1.87; 95% CI 1.43-2.45; p<0.001); sex, type of school, and ethnic group were not significant. The meta-analysis showed a weighted OR of 1.95 (1.91; 1.98) when comparing the permanent dentition period with the deciduous and mixed dentitions. CONCLUSIONS: In planning oral health services, activities are indicated to reduce the proportion of moderate/severe malocclusion to levels that are socially more acceptable and economically sustainable.
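
A weighted OR like the one reported is pooled from per-study 2x2 tables; the basic single-table computation with a Woolf 95% confidence interval looks like this (the counts below are invented, not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI for a 2x2 table [[a, b], [c, d]]."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) by Woolf's method.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: malocclusion yes/no in permanent vs. deciduous dentition.
or_, lo, hi = odds_ratio(300, 120, 200, 150)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```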

Relevance: 20.00%

Publisher:

Abstract:

The paper introduces an approach to generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs processed on a single machine. An Ant Colony System-based algorithm is validated on benchmark problems available in the OR-Library. The results obtained were compared with the best available results and found to be near-optimal. The computational results support conclusions about the algorithm's efficiency and effectiveness.
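
The objective being minimized is total weighted tardiness; a minimal evaluator plus a common dispatching-rule baseline that an ACS run would try to beat (the three-job instance is invented for illustration):

```python
def total_weighted_tardiness(sequence, proc, due, weight):
    """Sum of w_j * max(0, C_j - d_j) for jobs processed in the given order."""
    t, total = 0, 0
    for j in sequence:
        t += proc[j]                        # completion time C_j
        total += weight[j] * max(0, t - due[j])
    return total

proc = [3, 2, 4]    # processing times
due = [2, 4, 5]     # due dates
weight = [2, 1, 3]  # tardiness weights

# Earliest-due-date order as a simple baseline.
edd = sorted(range(3), key=lambda j: due[j])
print(edd, total_weighted_tardiness(edd, proc, due, weight))  # → [0, 1, 2] 15
```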

Relevance: 20.00%

Publisher:

Abstract:

In real optimization problems the analytical expression of the objective function, and of its derivatives, is usually unknown or complex. In these cases it becomes essential to use optimization methods that do not require computing derivatives or verifying their existence: direct search methods, also called derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for doing so are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem in which a step is accepted if it reduces either the objective function or the constraint violation. Filter methods are therefore less parameter-dependent than penalty functions. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex methods and filter methods. The method does not compute or approximate any derivatives, penalty constants, or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of the algorithm through some examples. The proposed methods were implemented in Java.
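
The filter acceptance rule described above can be isolated: a trial point, viewed as a pair (constraint violation, objective value), is accepted unless some stored pair dominates it in both entries. A bare-bones sketch, omitting the envelope/margin refinements that real filter implementations add:

```python
def dominates(p, q):
    """Pair p = (h, f) dominates q if it is no worse in both entries."""
    return p[0] <= q[0] and p[1] <= q[1]

def filter_accept(filt, point):
    """Accept `point` = (violation, objective) unless a filter entry dominates it;
    on acceptance, add it and drop any entries it now dominates."""
    if any(dominates(entry, point) for entry in filt):
        return filt, False
    filt = [entry for entry in filt if not dominates(point, entry)]
    return filt + [point], True

filt = [(0.0, 5.0), (2.0, 1.0)]
filt, ok = filter_accept(filt, (1.0, 3.0))  # dominated by neither entry
print(ok, filt)  # → True [(0.0, 5.0), (2.0, 1.0), (1.0, 3.0)]
```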

Relevance: 20.00%

Publisher:

Abstract:

Master's dissertation, Business Management (MBA), 16 July 2013, Universidade dos Açores.