934 results for Millionaire Problem, Efficiency, Verifiability, Zero Test, Batch Equation


Relevance:

30.00%

Publisher:

Abstract:

The Container Loading Problem (CLP) literature has traditionally evaluated the dynamic stability of cargo by applying two metrics to box arrangements: the mean number of boxes supporting the items, excluding those placed directly on the floor (M1), and the percentage of boxes with insufficient lateral support (M2). However, these metrics, which aim to be proxies for cargo stability during transportation, fail to capture real-world dynamic stability conditions. In this paper, two new performance indicators are proposed to evaluate the dynamic stability of cargo arrangements: the number of fallen boxes (NFB) and the number of boxes within the Damage Boundary Curve fragility test (NB_DBC). Using 1500 solutions for well-known problem instances from the literature, these new performance indicators are evaluated with a physics simulation tool (StableCargo), which replaces real-world truck transportation with a simulation of the dynamic behaviour of container loading arrangements. Two new dynamic stability metrics that can be integrated within any container loading algorithm are also proposed. The metrics are analytical models of the proposed stability performance indicators, computed by multiple linear regression, with Pearson's r correlation coefficient used to evaluate the performance of the models. Extensive computational results show that the proposed metrics are better proxies for dynamic stability in the CLP than the previously used metrics.
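The regression step described above can be sketched as follows. This is a minimal stand-in with hypothetical (M1, M2, NFB) values, not the paper's 1500-solution dataset: it fits a multiple linear regression by ordinary least squares and scores the fit with Pearson's r.

```python
# Illustrative sketch (hypothetical data, not the paper's): fit a multiple
# linear regression NFB ~ M1 + M2 and score it with Pearson's r, mirroring
# how analytical stability metrics are built from simulated indicators.
import math

def fit_linear(X, y):
    """Least-squares fit y = b0 + b1*x1 + b2*x2 via the normal equations."""
    A = [[1.0] + list(row) for row in X]   # design matrix with intercept
    k = len(A[0])
    # Normal equations: (A^T A) beta = A^T y
    AtA = [[sum(A[i][r] * A[i][c] for i in range(len(A))) for c in range(k)]
           for r in range(k)]
    Aty = [sum(A[i][r] * y[i] for i in range(len(A))) for r in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Aty[col], Aty[piv] = Aty[piv], Aty[col]
        for r in range(col + 1, k):
            f = AtA[r][col] / AtA[col][col]
            for c in range(col, k):
                AtA[r][c] -= f * AtA[col][c]
            Aty[r] -= f * Aty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (Aty[r] - sum(AtA[r][c] * beta[c]
                                for c in range(r + 1, k))) / AtA[r][r]
    return beta

def pearson_r(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a)
                           * sum((y - mb) ** 2 for y in b))

# Hypothetical (M1, M2) indicator pairs and simulated fallen-box counts.
X = [(1.2, 30.0), (1.5, 22.0), (2.1, 15.0), (2.4, 12.0), (3.0, 5.0)]
nfb = [14.0, 11.0, 7.0, 6.0, 2.0]
beta = fit_linear(X, nfb)
pred = [beta[0] + beta[1] * x1 + beta[2] * x2 for x1, x2 in X]
r = pearson_r(pred, nfb)
```

With data this close to linear, r lands near 1; the paper's point is precisely that a high r between the fitted metric and the simulated indicator makes the metric a usable proxy.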

Relevance:

30.00%

Publisher:

Abstract:

In the last two decades, the small strain shear modulus has become one of the most important geotechnical parameters for characterizing soil stiffness. Finite element analyses have shown that the in-situ stiffness of soils and rocks is much higher than previously thought, and that the stress-strain behaviour of these materials is non-linear in most cases at small strain levels, especially in the ground around retaining walls, foundations and tunnels, typically in the order of 10^-2 to 10^-4 of strain. Although the best approach to estimating the shear modulus seems to be measuring seismic wave velocities, deriving the parameter through correlations with in-situ tests is usually considered very useful in design practice. The use of Neural Networks for modelling systems has become widespread, in particular in areas where the large amount of available data and the complexity of the systems make the problem difficult to treat with traditional data analysis methodologies. In this work, the use of Neural Networks and Support Vector Regression is proposed to estimate the small strain shear modulus of sedimentary soils from the basic or intermediate parameters derived from the Marchetti Dilatometer Test. The results are discussed and compared with some of the most common methodologies available for this evaluation.
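A minimal sketch of the regression idea: a tiny one-hidden-layer neural network trained by gradient descent to map two DMT-style inputs to a stiffness target. The pre-scaled input/target values below are hypothetical, purely for illustration; the authors' actual models, features and data are not reproduced here.

```python
# Sketch (assumed data): one-hidden-layer regression net, pure Python,
# mapping two normalized dilatometer-style inputs to a normalized G0 value.
import math, random

random.seed(1)

def forward(w1, b1, w2, b2, x):
    h = [math.tanh(sum(wi * xi for wi, xi in zip(wrow, x)) + bi)
         for wrow, bi in zip(w1, b1)]
    return sum(w * hi for w, hi in zip(w2, h)) + b2, h

def train(data, hidden=4, lr=0.05, epochs=2000):
    w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in data:
            y, h = forward(w1, b1, w2, b2, x)
            err = y - t
            # Backpropagate the squared-error gradient.
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)
                w2[j] -= lr * err * h[j]
                for i in range(2):
                    w1[j][i] -= lr * grad_h * x[i]
                b1[j] -= lr * grad_h
            b2 -= lr * err
    return w1, b1, w2, b2

# Hypothetical (dilatometer modulus, horizontal stress index) -> G0 pairs,
# all pre-scaled to [0, 1].
data = [((0.1, 0.2), 0.15), ((0.3, 0.4), 0.35), ((0.5, 0.5), 0.5),
        ((0.7, 0.6), 0.65), ((0.9, 0.8), 0.85)]
net = train(data)
mse = sum((forward(*net, x)[0] - t) ** 2 for x, t in data) / len(data)
```

A Support Vector Regression variant would replace the network with a kernel machine, but the workflow (scaled in-situ test parameters in, small strain modulus out) is the same.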

Relevance:

30.00%

Publisher:

Abstract:

Leptospira spp. are delicate bacteria that cannot be studied by the usual microbiological methods. They cause leptospirosis, a zoonotic disease transmitted to humans through the infected urine of wild or domestic animals. We studied the incidence of this disease in the Uruguayan population, described its epidemiologic and clinical features, and compared diagnostic techniques. After examining 6,778 suspect cases, we estimated that about 15 infections per 100,000 inhabitants occurred yearly, affecting mainly young male rural workers. Awareness of leptospirosis has grown among health professionals, and its lethality has consequently decreased. Bovine infections were probably the principal source of human disease. Rainfall volumes and floods were major factors in the varying incidence. Most patients had fever, asthenia, myalgia or cephalalgia, with at least one additional abnormal clinical feature. Thirty to forty percent of confirmed cases presented abdominal signs and symptoms, conjunctival suffusion, or altered renal or urinary function. Jaundice was more frequent in patients aged over 40 years. Clinical infections followed an acute pattern and their usual outcome was complete recovery. Laboratory diagnosis was based on the standard indirect micro-agglutination technique (MAT). Second serum samples were difficult to obtain, often preventing completion of the diagnosis. Immunofluorescence was useful as a screening test and for early detection of probable infections.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Computational Logic

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Biomedical Engineering

Relevance:

30.00%

Publisher:

Abstract:

Thesis presented as a partial requirement for the degree of Doctor in Information Management

Relevance:

30.00%

Publisher:

Abstract:

The catastrophic disruption of the USA financial system in the wake of the financial crisis prompted the Federal Reserve to launch a Quantitative Easing (QE) programme in late 2008. In line with Pesaran and Smith (2014), I use a policy effectiveness test to assess whether this massive asset purchase programme was effective in stimulating economic activity in the USA. Specifically, I employ an Autoregressive Distributed Lag (ARDL) model to obtain a counterfactual for the USA real GDP growth rate. Using data from 1983Q1 to 2009Q4, the results show that the beneficial effects of QE appear to be weak and rather short-lived. The null hypothesis of policy ineffectiveness is not rejected, which suggests that QE did not have a meaningful impact on output growth.
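The counterfactual logic can be sketched with a deliberately simplified model. Here a plain AR(1) fitted on hypothetical pre-policy growth rates stands in for the paper's ARDL specification, and the policy effect is read off as the gap between "observed" and counterfactual post-policy paths; all numbers are invented for illustration.

```python
# Simplified policy-effectiveness sketch (assumed data, AR(1) instead of
# the paper's full ARDL): fit on pre-policy growth, project a no-policy
# counterfactual, and measure the average observed-minus-counterfactual gap.
def ols(x, y):
    """Closed-form simple OLS: returns (intercept, slope)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical pre-policy growth path following y_t = 0.5 + 0.5 * y_{t-1}.
pre = [0.0]
for _ in range(11):
    pre.append(0.5 + 0.5 * pre[-1])

a, b = ols(pre[:-1], pre[1:])      # regress y_t on its own lag

# No-policy counterfactual for 4 post-policy quarters.
cf = []
last = pre[-1]
for _ in range(4):
    last = a + b * last
    cf.append(last)

# "Observed" post-policy growth: counterfactual plus a hypothetical 0.2pp boost.
observed = [v + 0.2 for v in cf]
effect = sum(o - c for o, c in zip(observed, cf)) / len(cf)
```

The policy ineffectiveness test then asks whether a gap like `effect` is statistically distinguishable from zero given the model's forecast uncertainty.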

Relevance:

30.00%

Publisher:

Abstract:

The recycling of pavements is nowadays a very important issue for the road paving industry. With the objective of incorporating higher percentages of reclaimed asphalt (RA) materials in recycled asphalt mixtures, new techniques have been developed in recent years. The use of foamed bitumen is normally associated with the production of cold asphalt mixtures, which usually show lower quality standards. However, the objective of the work presented in this paper is to assess the use of foamed bitumen as the binder of warm asphalt mixtures incorporating 30% RA, which have quality standards similar to those of conventional mixtures. Thus, five mixtures were produced with 30% RA, one of them with a conventional bitumen (control mix) and the others with foamed bitumen at different production temperatures. The mixtures were tested for compactability and water sensitivity, and the results show that a reduction of 25 ºC in the production temperature is possible while keeping the water sensitivity test results close to 90%.

Relevance:

30.00%

Publisher:

Abstract:

This article revisits Michel Chevalier's work and his discussions of tariffs. Chevalier shifted from Saint-Simonism to economic liberalism over the course of his life in the 19th century. His influence was soon felt in the political world and in economic debates, mainly because of his discussion of tariffs as instruments of efficient transport policies. This work discusses Chevalier's thoughts on tariffs by revisiting his masterpiece, Le Cours d'Économie Politique. Data Envelopment Analysis (DEA) was conducted to test Chevalier's hypothesis on the inefficiency of French tariffs. The analysis shows that Chevalier's claims about French tariffs are not validated by DEA.
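As an illustration of the DEA idea (not the paper's model or data): in the single-input, single-output CCR special case, a unit's efficiency reduces to its output/input ratio normalized by the best observed ratio. The multi-variable case the paper actually uses requires solving a linear program per decision-making unit; the route figures below are hypothetical.

```python
# Single-input, single-output special case of DEA (CCR): efficiency is each
# unit's output/input ratio divided by the best ratio in the sample.
def dea_ratio(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical routes: input = tariff revenue collected, output = traffic served.
inputs = [10.0, 8.0, 12.0]
outputs = [100.0, 96.0, 90.0]
scores = dea_ratio(inputs, outputs)   # 1.0 marks the efficient unit
```

A score below 1 flags a unit (here, a tariff regime) that another unit dominates, which is the sense in which DEA can confront Chevalier's inefficiency claim with data.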

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis in Chemical and Biological Engineering

Relevance:

30.00%

Publisher:

Abstract:

Dissertation for the International Master in Sustainable Built Environment

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The 6-minute walk test is a way of assessing exercise capacity and predicting survival in heart failure. The intensity of the 6-minute walk test has been suggested to be similar to that of daily activities. We investigated the effect of motivation during the 6-minute walk test in heart failure. METHODS: We studied 12 males, age 45±12 years, ejection fraction 23±7%, and functional class III. Patients underwent the following tests: a maximal cardiopulmonary exercise test on the treadmill (max), a cardiopulmonary 6-minute walk test with the walking rhythm maintained between relatively easy and slightly tiring (levels 11 and 13 on the Borg scale) (6EB), and a cardiopulmonary 6-minute walk test using the usual recommendations (6RU). The 6EB and 6RU tests were performed on a treadmill with zero inclination and with the velocity controlled by the patient. RESULTS: The values obtained in the max, 6EB, and 6RU tests were, respectively, as follows: O2 consumption (ml.kg-1.min-1) 15.4±1.8, 9.8±1.9 (60±10%), and 13.3±2.2 (90±10%); heart rate (bpm) 142±12, 110±13 (77±9%), and 126±11 (89±7%); distance walked (m) 733±147, 332±66, and 470±48; and respiratory exchange ratio (R) 1.13±0.06, 0.9±0.06, and 1.06±0.12. Significant differences were observed between the max and 6EB tests, the max and 6RU tests, and the 6EB and 6RU tests (p<0.05). CONCLUSION: Patients who undergo the cardiopulmonary 6-minute walk test and are motivated to walk as much as they possibly can usually walk almost to their maximum capacity, which may not correspond to that of their daily activities. The use of the Borg scale during the cardiopulmonary 6-minute walk test seems to correspond better to the metabolic demand of the usual activities in this group of patients.
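The reported intensity levels follow from simple ratios of the quoted group means. Note that the abstract's percentages are means of individual patients' ratios, so they differ slightly from the ratio of the group means computed here.

```python
# Ratio-of-means check on the VO2 figures quoted in the abstract.
def pct_of_max(value, max_value):
    return 100.0 * value / max_value

vo2_max, vo2_6eb, vo2_6ru = 15.4, 9.8, 13.3   # ml.kg-1.min-1, group means
pct_6eb = pct_of_max(vo2_6eb, vo2_max)   # Borg-paced walk, ~64% of max
pct_6ru = pct_of_max(vo2_6ru, vo2_max)   # usual-recommendation walk, ~86% of max
```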

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis in Civil Engineering.

Relevance:

30.00%

Publisher:

Abstract:

The main object of the present paper is to give formulas and methods that enable us to determine the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all the processes is essentially the following. Knowing the frequency p of the desired events and q of the undesired events, we may calculate the frequency of all possible combinations to be expected in n repetitions by expanding the binomial (p + q)^n. Determining which of these combinations we want to avoid, we calculate their total frequency and select the value of the exponent n of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision: n! p^n [ 1/n! + (q/p)/1!(n-1)! + (q/p)^2/2!(n-2)! + (q/p)^3/3!(n-3)! + ... ] <= P_lim (1b). There is no absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1, 56) two relative values, one equal to 1/(5n) as the lowest value of probability and the other equal to 1/(10n) as the highest value of improbability, leaving between them what may be called the "region of doubt". However, these formulas cannot be applied in our case, since the number n is precisely the unknown quantity. Thus we have to use, instead of the more exact values of these two formulas, the conventional limits P_lim = 0.05 (precision 5%), P_lim = 0.01 (precision 1%) and P_lim = 0.001 (precision 0.1%). The binomial formula as explained above (cf. formula 1b), however, is of rather limited applicability owing to the excessive calculation necessary, and we thus have to use approximations as substitutes.
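Before turning to the approximations, the exact criterion behind formula (1b) can be evaluated directly on a computer (a modern illustration, not part of the original paper): find the smallest n for which the total probability of the outcomes we want to avoid falls below the chosen precision limit.

```python
# Direct enumeration of the binomial criterion: smallest n with
# P(fewer than k individuals of the desired type in n trials) <= P_lim.
from math import comb

def min_repetitions(p, k, p_lim):
    """Smallest n such that the unwanted tail probability is <= p_lim."""
    q = 1.0 - p
    n = k
    while True:
        tail = sum(comb(n, j) * p**j * q**(n - j) for j in range(k))
        if tail <= p_lim:
            return n
        n += 1

# Example with assumed numbers: a type expected in one plant out of four
# (p = 0.25); how many plants guarantee at least one at the 5% level?
n_needed = min_repetitions(0.25, 1, 0.05)   # 0.75^11 <= 0.05 < 0.75^10
```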
We may use, without loss of precision, the following approximations: a) the normal or Gaussian distribution when the expected frequency p has any value between 0.1 and 0.9 and when n is at least greater than ten; b) the Poisson distribution when the expected frequency p is smaller than 0.1. Tables V to VII show, for some special cases, that these approximations are very satisfactory. The practical solution of the following problems, stated in the introduction, can now be given: A) What is the minimum number of repetitions necessary in order to avoid that any one of a among m treatments, varieties, etc. may accidentally always be the best, the best and second best, or the first, second and third best? Using the first term of the binomial, we have the following equation for n: n = log P_lim / log(a/m) = log P_lim / (log a - log m) (5). B) What is the minimum number of individuals necessary in order that a certain type, expected with frequency p, may appear in at least one, two, three, or a = m + 1 individuals? 1) For p between 0.1 and 0.9, using the Gaussian approximation, we have: b = δ √((1 - p)/p), c = m/p (7); n = [ (b + √(b² + 4c)) / 2 ]², n' = 1/p, n(cor) = n + n' (8). We have to use the correction n' when p has a value between 0.25 and 0.75. The Greek letter δ represents here the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09 respectively. If we are only interested in having at least one individual, m becomes equal to zero, c vanishes, and the formula reduces to: n = b² = δ² (1 - p)/p, n' = 1/p, n(cor) = n + n' (9). 2) If p is smaller than 0.1 we may use Table 1 in order to find the mean m of a Poisson distribution and determine n = m/p. C) What is the minimum number of individuals necessary for distinguishing two frequencies p1 and p2?
1) When p1 and p2 are values between 0.1 and 0.9 we have: n = δ² [ √(p1(1 - p1)) + √(p2(1 - p2)) ]² / (p1 - p2)², n' = 1/(p1 - p2), n(cor) = n + n' (13). We have again to use the unilateral limits of the Gaussian distribution. The correction n' should be used if at least one of the values p1 or p2 lies between 0.25 and 0.75. A more complicated formula may be used in cases where we want to increase the precision: b = δ [ √(p1(1 - p1)) + √(p2(1 - p2)) ] / (p1 - p2), c = m/(p1 - p2), n = [ (b + √(b² + 4c)) / 2 ]², n' = 1/(p1 - p2) (14). 2) When both p1 and p2 are smaller than 0.1 we determine the quotient p1/p2 and look up the corresponding number m2 of a Poisson distribution in Table 2. The value of n is found by the equation n = m2/p2 (15). D) What is the minimum number necessary for distinguishing three or more frequencies p1, p2, p3? 1) If the frequencies p1, p2, p3 are values between 0.1 and 0.9, we have to solve the individual equation for each pair and use the highest value of n thus determined: n(1,2) = δ² [ √(p1(1 - p1)) + √(p2(1 - p2)) ]² / (p1 - p2)² (16). δ now represents the bilateral limits of the Gaussian distribution: 1.96, 2.58 and 3.29. 2) No table was prepared for the relatively rare cases of a comparison of three or more frequencies below 0.1; in such cases extremely high numbers would be required. E) A process is given which serves to solve two problems of an informatory nature: a) if a special type appears among n individuals with an observed frequency p(obs), what may be the corresponding ideal value p(esp); or b) if we study samples of n individuals and expect a certain type with frequency p(esp), what may be the extreme limits of p(obs) in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use Table 3.
To solve the first question we select the respective horizontal line for p(obs), determine which column corresponds to our value of n, and find the respective value of p(esp) by interpolating between columns. In order to solve the second problem we start with the respective column for p(esp) and find the horizontal line for the given value of n, either directly or by interpolation. 2) For frequencies smaller than 0.1 we have to use Table 4 and transform the fractions p(esp) and p(obs) into numbers of the Poisson series by multiplying by n. In order to solve the first problem, we verify in which line the lower Poisson limit is equal to m(obs) and transform the corresponding value of m into the frequency p(esp) by dividing by n. The observed frequency may thus be a chance deviate of any value between 0.0... and the value given by dividing the value of m in the table by n. In the second case we first transform the expectation p(esp) into a value of m and find, in the horizontal line corresponding to m(esp), the extreme values of m, which must then be transformed, by dividing by n, into values of p(obs). F) Partial and progressive tests may be recommended in all cases where there is lack of material or where the loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that the minimum number really represents at the same time a maximum number, necessary only if one essentially takes into consideration the unfavourable variations; smaller numbers may frequently already give satisfactory results. For instance, by definition we know that a frequency p means that we expect one individual in every total of 1/p. If there were no chance variations, this number would be sufficient, and if there were favourable variations a smaller number still might yield one individual of the desired type.
Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above, and increase the total until the desired result is obtained, which may well be before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetical experiments with maize.
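As a modern illustration of problem C, the sketch below applies one common reading of the two-frequency sample-size formula (13), with the one-sided 5% multiplier δ = 1.64 and the n' correction. The 3:1 versus 9:7 segregation example and the exact constants are assumptions for illustration, not taken from the paper.

```python
# Numerical reading of a two-frequency sample-size rule in the style of
# formula (13): n = [delta*(sqrt(p1 q1) + sqrt(p2 q2)) / (p1 - p2)]^2,
# plus the correction n' = 1/|p1 - p2| when either frequency is in [0.25, 0.75].
import math

def n_two_freqs(p1, p2, delta=1.64):
    num = delta * (math.sqrt(p1 * (1 - p1)) + math.sqrt(p2 * (1 - p2)))
    n = (num / (p1 - p2)) ** 2
    if 0.25 <= p1 <= 0.75 or 0.25 <= p2 <= 0.75:
        n += 1 / abs(p1 - p2)
    return math.ceil(n)

# Assumed example: distinguishing a 3:1 (p = 0.75) from a 9:7 (p = 0.5625)
# segregation at the one-sided 5% level.
n = n_two_freqs(0.75, 0.5625)
```

The closer the two frequencies are, the faster the required n grows, which is why the paper warns that comparisons of three or more rare frequencies demand extremely high numbers.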
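The partial or progressive procedure of section F can also be simulated (hypothetical parameters): grow the sample one individual at a time and stop as soon as the desired type appears, then compare the average stopping point with the guaranteed minimum number for p = 0.25 at the 5% level (n = 11 by the exact binomial criterion).

```python
# Simulation of the progressive procedure: sample until the desired type
# (frequency p) appears; the average stopping point is about 1/p, far below
# the worst-case "minimum number".
import random

def progressive_trial(p, rng):
    n = 0
    while True:
        n += 1
        if rng.random() < p:   # one more individual; is it the desired type?
            return n

rng = random.Random(7)
stops = [progressive_trial(0.25, rng) for _ in range(2000)]
avg_stop = sum(stops) / len(stops)   # expected value is 1/p = 4
```

This is exactly the paper's point: the minimum number is a worst-case guarantee, and with favourable chance variations far fewer individuals often suffice.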