832 results for Best Approximation


Relevance: 20.00%

Publisher:

Abstract:

In men with prior vasectomy, microsurgical reconstruction of the reproductive tract is more cost-effective than sperm retrieval with in vitro fertilization and intracytoplasmic sperm injection if the obstructive interval is less than 15 years and no female fertility risk factors are present. If epididymal obstruction is detected or advanced female age is present, the decision to use either microsurgical reconstruction or sperm retrieval with in vitro fertilization and intracytoplasmic sperm injection should be individualized. Sperm retrieval with in vitro fertilization and intracytoplasmic sperm injection is preferred to surgical treatment when female factors requiring in vitro fertilization are present or when the chance for success with sperm retrieval and intracytoplasmic sperm injection exceeds the chance for success with surgical treatment.


Doctoral thesis in Legal Sciences (specialization in Public Legal Sciences)


The Symbolic Aggregate Approximation (iSAX) is widely used in time series data mining. Its popularity arises from the fact that it largely reduces time series size, it is symbolic, allows lower bounding and is space efficient. However, it requires setting two parameters: the symbolic length and alphabet size, which limits the applicability of the technique. The optimal parameter values are highly application dependent. Typically, they are either set to a fixed value or experimentally probed for the best configuration. In this work we propose an approach to automatically estimate iSAX’s parameters. The approach – AutoiSAX – not only discovers the best parameter setting for each time series in the database, but also finds the alphabet size for each iSAX symbol within the same word. It is based on simple and intuitive ideas from time series complexity and statistics. The technique can be smoothly embedded in existing data mining tasks as an efficient sub-routine. We analyze its impact in visualization interpretability, classification accuracy and motif mining. Our contribution aims to make iSAX a more general approach as it evolves towards a parameter-free method.
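Although the abstract concerns automatic parameter selection for iSAX rather than the base method itself, a minimal sketch of the underlying SAX discretization (z-normalization, piecewise aggregate approximation, Gaussian breakpoints) may clarify what the two parameters, word length and alphabet size, control. The function `sax` and the breakpoint table below are illustrative, not the authors' AutoiSAX code:

```python
import numpy as np

# Gaussian breakpoints for small alphabets: cut points that split the
# standard normal N(0, 1) into equiprobable regions. Larger alphabets
# use the corresponding normal quantiles.
BREAKPOINTS = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}

def sax(series, word_length, alphabet_size):
    """Discretize a 1-D series into a SAX word of `word_length` symbols."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                   # z-normalize
    segments = np.array_split(x, word_length)      # PAA: piecewise means
    paa = np.array([seg.mean() for seg in segments])
    cuts = BREAKPOINTS[alphabet_size]
    symbols = np.searchsorted(cuts, paa)           # map means to symbols
    return "".join(chr(ord("a") + int(s)) for s in symbols)

word = sax([0.1, 0.3, 2.0, 2.2, -1.5, -1.4, 0.2, 0.1],
           word_length=4, alphabet_size=4)
print(word)  # -> "bdab"
```

Increasing `word_length` preserves more of the series' shape, while increasing `alphabet_size` refines the amplitude resolution; AutoiSAX's contribution is choosing both automatically per series.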


Integrated master's dissertation in Industrial Engineering and Management


Residual lignocellulosic materials from agro-industrial activities can be exploited as a source of lignin, hemicellulose and cellulose. Chemical treatment of lignocellulosic material must contend with the fact that this material is quite recalcitrant to such attack, mainly because of the polymer lignin. Delignification can also be achieved with white-rot fungi, which produce extracellular ligninolytic enzymes, chiefly laccase, an enzyme that oxidizes lignin to CO2. Laccase also oxidizes a wide range of substrates (phenols, polyphenols, anilines, aryl diamines, methoxy-substituted phenols, and others), which is a good reason for its attractiveness in biotechnological applications. The enzyme has potential application in processes such as the delignification of lignocellulosic materials and the biobleaching of paper pulp, the treatment of industrial wastewater, fiber modification and dye decolorization in the textile and dye industries, the improvement of animal feed, the detoxification of pollutants, and the bioremediation of contaminated soils. It has also been used in organic chemistry for the oxidation of functional groups, the formation of carbon-nitrogen bonds, and the synthesis of complex natural products. HYPOTHESIS: White-rot fungi, under optimal culture conditions, produce different types of oxidase enzymes, the laccases being the most suitable to explore as catalysts in the following processes: delignification of forest-industry residues so that such waste can be used in animal feed; decontamination/remediation of soils and/or industrial effluents. Studies will be carried out on the design of bioreactors that address the two questions raised in the hypothesis.
For the delignification of lignocellulosic material two strategies are proposed: (1) treating the material with the fungal mycelium, adjusting the nutrient supply to sustain growth and favor release of the enzyme; (2) using partially purified laccase coupled to a mediator system to oxidize the polyphenolic compounds. For the decontamination/remediation of soils and/or industrial effluents, work will also proceed on two fronts: (3) a positive correlation has been described between the activity of certain soil enzymes and soil fertility; an enzymatic system tentatively identified as a laccase of microbial origin is known to be responsible for the transformation of organic compounds in soil, protecting the soil from the accumulation of hazardous organic compounds by catalyzing reactions involving degradation, polymerization and incorporation into humic acid complexes. Soils spiked with different pollutants (e.g. polychlorophenols or chloroanilines) will be used. (4) Work will be done with polluting industrial effluents (olive-mill wastewater and/or the liquid effluent from the olive debittering process).


Navier-Stokes equations, slip boundary condition, convection-diffusion equation, finite element method, multigrid method, error estimation, iterative decoupling


Dual antiplatelet therapy is a well-established treatment in patients with non-ST elevation acute coronary syndrome (NSTE-ACS), with a class I recommendation (level of evidence A) in current national and international guidelines. Nonetheless, these guidelines are neither precise nor consensual regarding the best time to start the second antiplatelet agent. The evidence is conflicting, and after more than a decade of clopidogrel use in this scenario, the benefit of routine pretreatment with dual antiplatelet therapy, i.e. before the coronary anatomy is known, remains uncertain. The recommendation for upfront treatment with clopidogrel in NSTE-ACS is based on the reduction of non-fatal events in studies that used a conservative strategy with eventual invasive stratification many days after the acute event. This approach differs from the current management of these patients, given the established benefits of an early invasive strategy, especially in moderate- to high-risk patients. The only randomized study to date that specifically tested pretreatment in NSTE-ACS in the context of an early invasive strategy used prasugrel, and it showed no benefit of pretreatment in reducing ischemic events; on the contrary, pretreatment increased the risk of bleeding events. This study has brought pretreatment back into discussion and has led to changes in recent guidelines of the American and European cardiology societies. In this paper, the authors review the main evidence on pretreatment with dual antiplatelet therapy in NSTE-ACS.


The main object of the present paper is to give formulas and methods for determining the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all the processes is essentially the following. Knowing the frequency $p$ of the desired events and $q$ of the undesired events, we may calculate the frequency of all possible combinations to be expected in $n$ repetitions by expanding the binomial $(p+q)^n$. Determining which of these combinations we want to avoid, we calculate their total frequency, selecting the exponent $n$ of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision:

$$n!\,p^n \left[ \frac{1}{n!}\left(\frac{q}{p}\right)^{n} + \frac{1}{1!\,(n-1)!}\left(\frac{q}{p}\right)^{n-1} + \frac{1}{2!\,(n-2)!}\left(\frac{q}{p}\right)^{n-2} + \frac{1}{3!\,(n-3)!}\left(\frac{q}{p}\right)^{n-3} + \cdots \right] \le P_{\lim} \quad (1b)$$

There is no absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1, 5, 6) two relative values, one equal to $1/5n$ as the lowest value of probability and the other equal to $1/10n$ as the highest value of improbability, leaving between them what may be called the "region of doubt". These formulas cannot, however, be applied in our case, since the number $n$ is precisely the unknown quantity. Thus, instead of the more exact values of these two formulas, we have to use the conventional limits $P_{\lim} = 0.05$ (precision 5%), $P_{\lim} = 0.01$ (precision 1%) and $P_{\lim} = 0.001$ (precision 0.1%). The binomial formula as explained above (cf. formula 1b), however, is of rather limited applicability owing to the excessive calculation required, and we therefore have to seek approximations as substitutes.

We may use, without loss of precision, the following approximations: (a) the normal or Gaussian distribution when the expected frequency $p$ has any value between 0.1 and 0.9 and $n$ is at least greater than ten; (b) the Poisson distribution when the expected frequency $p$ is smaller than 0.1. Tables V to VII show, for some special cases, that these approximations are very satisfactory. The practical solution of the problems stated in the introduction can now be given.

A) What is the minimum number of repetitions necessary to avoid that any one of $a$ treatments, varieties, etc. may accidentally be always the best, or the best and second best, or the first, second and third best, or finally one of the $m$ best? Using the first term of the binomial, we have for $n$:

$$n = \frac{\log P_{\lim}}{\log(m/a)} = \frac{\log P_{\lim}}{\log m - \log a} \quad (5)$$

B) What is the minimum number of individuals necessary in order that a certain type, expected with the frequency $p$, may appear in at least one, two, three, or $a = m+1$ individuals?

1) For $p$ between 0.1 and 0.9, using the Gaussian approximation:

$$b = \delta \sqrt{\frac{1-p}{p}}, \qquad c = \frac{m}{p} \quad (7)$$

$$\sqrt{n} = \frac{b + \sqrt{b^2 + 4c}}{2}, \qquad n' = \frac{1}{p}, \qquad n_{cor} = n + n' \quad (8)$$

The correction $n'$ is used when $p$ has a value between 0.25 and 0.75. The Greek letter $\delta$ represents here the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09, respectively. If we are interested in having at least one individual only, $m$ becomes equal to zero and the formula reduces to:

$$c = 0, \qquad n = b^2 = \delta^2\,\frac{1-p}{p}, \qquad n' = \frac{1}{p}, \qquad n_{cor} = n + n' \quad (9)$$

2) If $p$ is smaller than 0.1, we may use Table 1 to find the mean $m$ of a Poisson distribution and determine $n = m/p$.

C) What is the minimum number of individuals necessary for distinguishing two frequencies $p_1$ and $p_2$?

1) When $p_1$ and $p_2$ are values between 0.1 and 0.9:

$$n = \delta^2\,\frac{p_1(1-p_1) + p_2(1-p_2)}{(p_1-p_2)^2}, \qquad n' = \frac{1}{p_1-p_2}, \qquad n_{cor} = n + n' \quad (13)$$

We again use the unilateral limits of the Gaussian distribution. The correction $n'$ should be used if at least one of the values $p_1$ or $p_2$ lies between 0.25 and 0.75. A more complicated formula may be used where we want to increase the precision:

$$b = \frac{\delta \sqrt{p_1(1-p_1) + p_2(1-p_2)}}{p_1-p_2}, \qquad c = \frac{m}{p_1-p_2}, \qquad \sqrt{n} = \frac{b + \sqrt{b^2 + 4c}}{2}, \qquad n' = \frac{1}{p_1-p_2} \quad (14)$$

2) When both $p_1$ and $p_2$ are smaller than 0.1, we determine the quotient $p_1 : p_2$ and look up the corresponding number $m_2$ of a Poisson distribution in Table 2. The value of $n$ is found from:

$$n = \frac{m_2}{p_2} \quad (15)$$

D) What is the minimum number necessary for distinguishing three or more frequencies $p_1 > p_2 > p_3$? 1) If the frequencies $p_1, p_2, p_3$ are values between 0.1 and 0.9, we have to solve the individual pairwise equations and use the highest value of $n$ thus determined:

$$n_{1,2} = \delta^2\,\frac{p_1(1-p_1) + p_2(1-p_2)}{(p_1-p_2)^2}, \qquad n_{2,3} = \delta^2\,\frac{p_2(1-p_2) + p_3(1-p_3)}{(p_2-p_3)^2} \quad (16)$$

$\delta$ now represents the bilateral limits of the Gaussian distribution: 1.96, 2.58 and 3.29. 2) No table was prepared for the relatively rare case of a comparison of three or more frequencies below 0.1; in such cases extremely high numbers would be required.

E) A process is given which serves to solve two problems of an informatory nature: (a) if a special type appears among $n$ individuals with a frequency $p_{obs}$, what may be the corresponding ideal value $p_{esp}$; or (b) if we study samples of $n$ individuals and expect a certain type with a frequency $p_{esp}$, what may be the extreme limits of $p_{obs}$ in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use Table 3. To solve the first question we select the horizontal line for $p_{obs}$, determine which column corresponds to our value of $n$, and find the respective value of $p_{esp}$ by interpolating between columns. To solve the second problem we start from the column for $p_{esp}$ and find the horizontal line for the given value of $n$, either directly or by interpolation. 2) For frequencies smaller than 0.1 we have to use Table 4 and transform the fractions $p_{esp}$ and $p_{obs}$ into numbers of the Poisson series by multiplying by $n$. To solve the first problem, we verify in which line the lower Poisson limit is equal to $m_{obs}$ and transform the corresponding value of $m$ into a frequency $p_{esp}$ by dividing by $n$. The observed frequency may thus be a chance deviate of any value between 0.0... and the value given by dividing the tabulated $m$ by $n$. In the second case we first transform the expectation $p_{esp}$ into a value of $m$ and obtain, in the horizontal line corresponding to $m_{esp}$, the extreme values of $m$, which must then be transformed back into values of $p_{obs}$ by dividing by $n$.

F) Partial and progressive tests may be recommended in all cases where material is lacking, or where the loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that this minimum number really represents at the same time a maximum number, necessary only if one takes into consideration essentially the unfavorable variations; smaller numbers may frequently already give satisfactory results. For instance, by definition, a frequency $p$ means that we expect one individual in every $1/p$. If there were no chance variations, this number $1/p$ would be sufficient, and with favorable variations a still smaller number might yield one individual of the desired type. Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above and increase the total until the desired result is obtained, which may well happen before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetic experiments with maize.
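The closed-form results of problems A and B can be turned into a small calculator. The sketch below implements formula (5) and the Gaussian case of problem B as reconstructed from the abstract; the function names and the rounding up to whole repetitions are illustrative assumptions, not part of the original paper:

```python
import math

# One-sided normal deviates for the three conventional precision limits
# quoted in the abstract (P_lim = 0.05, 0.01, 0.001).
DELTA_ONE_SIDED = {0.05: 1.64, 0.01: 2.33, 0.001: 3.09}

def n_best_by_chance(a, m, p_lim):
    """Problem A, formula (5): repetitions needed so that the chance of
    m fixed treatments out of a ranking on top every time is below p_lim."""
    return math.ceil(math.log(p_lim) / (math.log(m) - math.log(a)))

def n_individuals(p, m, p_lim):
    """Problem B (Gaussian case, 0.1 < p < 0.9): individuals needed to
    observe the type of frequency p in at least m + 1 of them."""
    delta = DELTA_ONE_SIDED[p_lim]
    b = delta * math.sqrt((1 - p) / p)     # formula (7)
    c = m / p
    n = ((b + math.sqrt(b * b + 4 * c)) / 2) ** 2   # formula (8)
    if 0.25 <= p <= 0.75:                  # correction n' = 1/p
        n += 1 / p
    return math.ceil(n)

# One treatment out of 8 always ranking best in n trials:
print(n_best_by_chance(a=8, m=1, p_lim=0.05))  # -> 2, since (1/8)^2 < 0.05
```

For example, to see at least one individual of a type expected at frequency p = 0.25 with 5% precision, `n_individuals(0.25, 0, 0.05)` returns 13, i.e. 1.64² · 3 plus the correction 1/p, rounded up.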


Today, usability testing is essential in the development of software and systems. A stationary usability lab offers many possibilities for evaluating usability, but it reaches its limits in terms of flexibility and experimental conditions. Mobile usability studies deliberately take outside influences into account, and they require a specially adapted approach to preparation, implementation and evaluation. Using the example of a mobile eye-tracking study, the difficulties and opportunities of mobile testing are considered.


A comparative analysis of the restoration of continuous signals by different kinds of approximation is performed. A software product is proposed that determines the optimal method for restoring different original signals by Lagrange polynomial, Kotelnikov interpolation series, linear and cubic splines, the Haar wavelet, and the Kotelnikov-Shannon wavelet, based on the criterion of minimum mean-square deviation. Practical recommendations on the selection of the approximation function for different classes of signals are given.
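The selection criterion described, minimum mean-square deviation between the original and the restored signal, can be sketched as follows. This hypothetical example compares only two of the listed restoration methods (piecewise-linear interpolation, and a global polynomial fit standing in for the Lagrange interpolant through the samples) on a sampled sine; it is not the proposed software product:

```python
import numpy as np

def rmse(a, b):
    """Root-mean-square deviation between two sampled signals."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Original signal, its sparse samples, and a dense evaluation grid.
t_dense = np.linspace(0.0, 1.0, 201)
signal = np.sin(2 * np.pi * t_dense)
t_samp = np.linspace(0.0, 1.0, 9)
samples = np.sin(2 * np.pi * t_samp)

# Candidate restorations evaluated on the dense grid.
restored = {
    "linear": np.interp(t_dense, t_samp, samples),
    "polynomial": np.polyval(
        np.polyfit(t_samp, samples, len(t_samp) - 1), t_dense
    ),
}

# Pick the method with the smallest mean-square deviation.
best = min(restored, key=lambda name: rmse(signal, restored[name]))
print(best, {k: round(rmse(signal, v), 5) for k, v in restored.items()})
```

For this smooth signal the degree-8 interpolating polynomial restores the sine far more accurately than piecewise-linear interpolation, which matches the paper's premise that the best approximation method depends on the class of signal.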