995 results for linear complexity
Abstract:
Background: Heart surgery has developed with increasing patient complexity. Objective: To assess the use of resources and real costs stratified by risk factors of patients submitted to surgical cardiac procedures, and to compare them with the values reimbursed by the Brazilian Unified Health System (SUS). Method: All cardiac surgery procedures performed between January and July 2013 in a tertiary referral center were analyzed. Demographic and clinical data allowed the calculation of the value reimbursed by the Brazilian SUS. Patients were stratified into low, intermediate and high-risk categories according to the EuroSCORE. Clinical outcomes, use of resources and costs (real costs versus SUS reimbursement) were compared between the established risk groups. Results: Postoperative mortality rates of the low, intermediate and high-risk EuroSCORE strata showed a significant positive linear correlation (3.8%, 10%, and 25%, respectively; p < 0.0001), as did the occurrence of any postoperative complication (13.7%, 20.7%, and 30.8%, respectively; p = 0.006). Accordingly, length of stay increased from 20.9 days to 24.8 and 29.2 days (p < 0.001). Real cost rose in parallel with increased resource use across EuroSCORE risk strata (R$ 27.116,00 ± R$ 13.928,00 versus R$ 34.854,00 ± R$ 27.814,00 versus R$ 43.234,00 ± R$ 26.009,00, respectively; p < 0.001). SUS reimbursement also increased (R$ 14.306,00 ± R$ 4.571,00 versus R$ 16.217,00 ± R$ 7.298,00 versus R$ 19.548,00 ± R$ 935,00; p < 0.001). However, as the EuroSCORE increased, there was a significant difference (p < 0.0001) between the slope of the real-cost increase and that of the SUS reimbursement per EuroSCORE risk stratum. Conclusion: A higher EuroSCORE was associated with higher postoperative mortality, more complications, longer length of stay, and higher costs. Although SUS reimbursement increased with risk, it was not proportional to real costs.
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, Diss., 2012
Abstract:
Magdeburg, Univ., Faculty of Mathematics, Diss., 2013
Abstract:
Otto-von-Guericke-Universität Magdeburg, Faculty of Mathematics, Dissertation, 2015
Abstract:
We consider linear stochastic differential-algebraic equations with constant coefficients and additive white noise. Due to the nature of this class of equations, the solution must be defined as a generalised process (in the sense of Dawson and Fernique). We provide sufficient conditions for the law of the variables of the solution process to be absolutely continuous with respect to Lebesgue measure.
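As an illustration in generic notation (the symbols below are not necessarily the authors' own), an equation of this class can be written as

```latex
% Generic linear SDAE with constant coefficients and additive white noise
% (illustrative notation only): A, B, \Lambda are constant matrices and
% A is singular, which is what makes the system "algebraic".
A\,\mathrm{d}X_t + B\,X_t\,\mathrm{d}t = \Lambda\,\mathrm{d}W_t,
\qquad t \ge 0 .
```

Roughly speaking, because A is singular, the algebraic constraints can force components of X to involve derivatives of the white noise, which is why the solution can only be defined as a generalised process tested against smooth functions.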
Abstract:
We say the endomorphism problem is solvable for an element W in a free group F if it can be decided effectively whether, given U in F, there is an endomorphism Φ of F sending W to U. This work analyzes an approach due to C. Edmunds and improved by C. Sims. We prove that when W is a two-generator word, this approach yields an efficient algorithm: it solves the endomorphism problem in time polynomial in the length of U. This result gives a polynomial-time algorithm for solving, in free groups, two-variable equations in which all the variables occur on one side of the equality and all the constants on the other.
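The decision problem itself is easy to state concretely. The sketch below is illustrative only: it is not the Edmunds-Sims algorithm, and the encoding of free-group elements as case-sensitive strings (uppercase = inverse of the lowercase generator) is my own assumption. It merely checks whether one candidate endomorphism, given by its images of the generators, sends W to U:

```python
def free_reduce(w):
    """Freely reduce a word over {a, b, A, B}, where an uppercase letter
    is the inverse of the corresponding lowercase generator."""
    out = []
    for c in w:
        if out and out[-1] == c.swapcase():
            out.pop()          # cancel adjacent x x^-1 pair
        else:
            out.append(c)
    return ''.join(out)

def apply_endo(phi, w):
    """Apply the endomorphism phi (a dict mapping generators 'a', 'b'
    to words) to the word w, and return the freely reduced image.
    The image of an inverse generator is the reversed, inverted image."""
    image = []
    for c in w:
        if c.islower():
            image.append(phi[c])
        else:
            image.append(phi[c.lower()][::-1].swapcase())
    return free_reduce(''.join(image))

# Example: the endomorphism a -> ab, b -> b fixes the commutator abAB.
phi = {'a': 'ab', 'b': 'b'}
assert apply_endo(phi, 'abAB') == 'abAB'
```

Deciding the endomorphism problem means quantifying over all such φ, which is where the real algorithmic content of the Edmunds-Sims approach lies.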
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for predicting the value of a variable based on the values of others, as in linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules stating that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules, there is a trade-off between a rule's accuracy and its simplicity, so rule selection can be viewed as a choice problem among pairs of degrees of accuracy and complexity. However, one cannot in general tell what the feasible set in the accuracy-complexity space is. Formally, we show that determining whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that obtains a certain value of R² is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.
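A minimal sketch of the hard problem in the linear-regression setting (the function name and the R² target are my own; standard least squares is assumed): find the smallest subset of variables reaching a target R². Brute force over subsets, as below, has exponential cost, which is precisely what the hardness result says cannot in general be avoided.

```python
import itertools

import numpy as np

def best_small_subset(X, y, r2_target=0.99):
    """Return the smallest variable subset whose least-squares fit
    reaches r2_target, found by exhaustive search over all subsets
    in order of increasing size (exponential in the number of columns)."""
    n, d = X.shape
    ss_tot = np.sum((y - y.mean()) ** 2)
    for size in range(1, d + 1):
        for subset in itertools.combinations(range(d), size):
            A = np.column_stack([X[:, subset], np.ones(n)])  # with intercept
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            r2 = 1.0 - np.sum(resid ** 2) / ss_tot
            if r2 >= r2_target:
                return subset, r2
    return None, 0.0

# Toy data: y depends on exactly two of the five columns.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = X[:, 0] + 2 * X[:, 2]
subset, r2 = best_small_subset(X, y)   # finds (0, 2)
```

With d variables there are 2^d subsets to examine; the hardness result says no method can, in general, shortcut this search.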
Abstract:
We consider multidimensional backward stochastic differential equations (BSDEs). We prove the existence and uniqueness of solutions when the coefficients grow super-linearly and, moreover, need be locally Lipschitz in neither the variable y nor the variable z. This is done for super-linear growth coefficients and a p-integrable terminal condition (p > 1). As an application, we establish the existence and uniqueness of solutions to degenerate semilinear PDEs with a super-linear growth generator and Lp terminal data, p > 1. Our results cover, for instance, the case of PDEs with logarithmic nonlinearities.
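In standard notation (a sketch; the paper's own symbols may differ), a BSDE with terminal condition ξ and generator f reads

```latex
% Standard form of a backward stochastic differential equation:
% Y is the value process, Z the control process, W a Brownian motion.
Y_t = \xi + \int_t^T f(s, Y_s, Z_s)\,\mathrm{d}s
          - \int_t^T Z_s\,\mathrm{d}W_s,
\qquad 0 \le t \le T .
```

Here the result concerns generators f that may grow super-linearly in y and terminal conditions ξ that are merely p-integrable for some p > 1.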
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
We extend Floquet theory for reducing nonlinear periodic difference systems to autonomous (in fact, linear) ones by using normal form theory.
Abstract:
The Whitehead minimization problem consists in finding a minimum-size element in the automorphic orbit of a word, a cyclic word or a finitely generated subgroup in a finite rank free group. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependency in the rank of the free group. It follows that the primitivity problem (deciding whether a word is an element of some basis of the free group) and the free factor problem can also be solved in polynomial time.
Abstract:
Based on Lucas functions, improved versions of the Diffie-Hellman key distribution scheme and of the ElGamal public-key cryptosystem are proposed, together with an implementation and an analysis of their computational cost. The security relies on the difficulty of factoring an RSA integer and on the difficulty of computing the discrete logarithm.
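As a hedged sketch of the underlying mechanism (toy parameters only; this is not the proposed scheme itself, and a real scheme would use an RSA-sized modulus): with Q = 1, Lucas functions satisfy V_a(V_b(P)) = V_{ab}(P), which plays the role that modular exponentiation plays in classical Diffie-Hellman.

```python
def lucas_v(n, p, m):
    """Compute the Lucas function V_n(p, 1) mod m with a left-to-right
    binary ladder, using the doubling identities V_{2k} = V_k^2 - 2 and
    V_{2k+1} = V_k * V_{k+1} - p (both valid when Q = 1)."""
    v0, v1 = 2 % m, p % m              # (V_0, V_1)
    for bit in bin(n)[2:]:
        if bit == '1':
            v0, v1 = (v0 * v1 - p) % m, (v1 * v1 - 2) % m
        else:
            v0, v1 = (v0 * v0 - 2) % m, (v0 * v1 - p) % m
    return v0

# Toy Diffie-Hellman-style exchange (illustrative parameters).
N, P = 10007, 5                        # public modulus and base
a, b = 123, 456                        # private keys
ka = lucas_v(a, lucas_v(b, P, N), N)   # Alice combines Bob's public value
kb = lucas_v(b, lucas_v(a, P, N), N)   # Bob combines Alice's public value
assert ka == kb == lucas_v(a * b, P, N)
```

The composition identity V_a(V_b(P)) = V_{ab}(P) holds as a polynomial identity over the integers, so the two parties always derive the same shared value.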
Abstract:
Studies evaluating the mechanical behavior of the trabecular microstructure play an important role in our understanding of pathologies such as osteoporosis, as well as of bone fracture and bone adaptation. Understanding this behavior is important for predicting fractures and providing early treatment. The objective of this study is to present a numerical model for studying the initiation and accumulation of trabecular bone microdamage in both the pre- and post-yield regions. A sub-region of human vertebral trabecular bone was analyzed using a uniformly loaded, anatomically accurate, microstructural three-dimensional finite element model. The evolution of trabecular bone microdamage was governed by a non-linear, modulus-reduction, perfect-damage approach derived from a generalized plasticity stress-strain law. The model introduced in this paper establishes a history of microdamage evolution in both the pre- and post-yield regions.