980 results for Prescribed mean-curvature problem


Relevance:

30.00%

Publisher:

Abstract:

Plakhov, A.Y.; Gouveia, P.D.F. (2007) 'Problems of maximal mean resistance on the plane', Nonlinearity 20(9), pp. 2271-2287.

Relevance:

30.00%

Publisher:

Abstract:

In this work we revisit the problem of hedging a contingent claim under a mean-square criterion. We prove that, in an incomplete market, a probability measure can be identified under which the (discounted) price process becomes a martingale. This is in fact a new proposition on the martingale representation theorem. The new results also identify a weight function that serves as an approximation to the Radon-Nikodým derivative of the unique risk-neutral martingale measure.
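
For reference, a standard formulation of the mean-square (quadratic) hedging criterion that the abstract refers to is the following; the notation is generic rather than the paper's own:

```latex
\min_{c,\;\vartheta}\;\mathbb{E}\!\left[\left(H - c - \int_0^T \vartheta_t\,\mathrm{d}S_t\right)^{2}\right]
```

Here H is the payoff of the contingent claim, c the initial capital, \vartheta the hedging strategy and S the (discounted) price process; the weight function mentioned in the abstract arises when approximating the change of measure that makes S a martingale.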

Relevance:

30.00%

Publisher:

Abstract:

The 90° problem of cosmic-ray transport theory is revisited in this paper. By using standard forms of the wave spectrum in the solar wind, the pitch-angle Fokker–Planck coefficient and the parallel mean free path are computed for different resonance functions. A critical comparison is made of the strength of 90° scattering due to plasma-wave effects, dynamical turbulence effects and nonlinear effects. It is demonstrated that dynamical effects are usually dominant only for low-energy cosmic particles. The novel results presented here are essential for comparing heliospheric observations of the parallel mean free path with theoretical model results.
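
The two quantities computed in this abstract are connected by the standard quasi-linear relation between the parallel mean free path and the pitch-angle Fokker–Planck coefficient:

```latex
\lambda_{\parallel} \;=\; \frac{3v}{8}\int_{-1}^{+1}\frac{\left(1-\mu^{2}\right)^{2}}{D_{\mu\mu}(\mu)}\,\mathrm{d}\mu
```

where v is the particle speed and \mu the pitch-angle cosine. The 1/D_{\mu\mu} weighting makes the integral sensitive to scattering through \mu = 0, which is precisely why the 90° problem controls the parallel mean free path.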

Relevance:

30.00%

Publisher:

Abstract:

The reduction of forest floor ground cover and litter layers by prescribed fires may alter the morphology (field and microscopic) and physical properties of surface horizons. This study determined long-term (35 yr) changes in surface horizon bulk density, organic matter concentration and content, and morphology in response to periodic (5 yr) and annual (1 yr) prescribed fires. Soils were fine-silty, siliceous, thermic Glossic Fragiudults, supporting mixed oak vegetation in middle Tennessee. Upper mineral soils (0- to 2-cm and 0- to 7.6-cm depths) were sampled and detailed field descriptions were made. Periodic and control plots had a thin layer of Oi, Oe, and Oa horizons 5 yr after the 1993 burn, whereas on annual burn plots a 1- to 2-cm charred layer was present. Burning significantly reduced organic matter concentration and mean A horizon thickness (6.4, 4.6, and 2.9 cm in control, periodic, and annual plots, respectively). Periodic burns did not significantly alter the organic matter and bulk density of the upper 7.6 cm of mineral soil; however, annual burns resulted in significantly higher bulk densities (1.01, 1.07, and 1.29 Mg m⁻³ in control, periodic, and annual plots, respectively) and lower organic matter concentrations and contents. Microscopic investigations confirmed that compaction was increased by annual burning. Thin sections also revealed that the granular structure of the A horizons in control and periodic plots resulted from bioturbation by macro- and mesofauna, fungi, and roots. Long-term annual burning greatly affected surface soil properties, whereas periodic burning on a 5-yr cycle had only limited effects.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider the problem of tracking similar objects. We show how a mean field approach can be used to deal with interacting targets and we compare it with Markov Chain Monte Carlo (MCMC). Two mean field implementations are presented. The first one is more general and uses particle filtering. We discuss some simplifications of the base algorithm that reduce the computation time. The second one is based on suitable Gaussian approximations of probability densities that lead to a set of self-consistent equations for the means and covariances. These equations give the Kalman solution if there is no interaction. Experiments have been performed on two kinds of sequences. The first kind is composed of a single long sequence of twenty roaming ants and was previously analysed using MCMC. In this case, our mean field algorithms obtain substantially better results. The second kind corresponds to selected sequences of a football match in which the interaction avoids tracker coalescence in situations where independent trackers fail.
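
To make the self-consistency idea concrete, the following is a minimal, hypothetical sketch of the Gaussian mean-field flavour of the approach: each target keeps its independent Kalman estimate, an interaction term couples the current estimates of all targets, and the coupled update is iterated to a fixed point. The Gaussian repulsion potential, its strength parameter and the function names are illustrative assumptions, not the paper's equations.

```python
import numpy as np

def mean_field_update(kalman_means, kalman_covs, repulsion=1.0, n_iters=20):
    """Iterate a coupled correction of per-target Kalman means to a
    self-consistent fixed point (illustrative sketch only)."""
    means = [m.copy() for m in kalman_means]
    for _ in range(n_iters):
        new_means = []
        for i, (m0, P) in enumerate(zip(kalman_means, kalman_covs)):
            # Gradient of an assumed Gaussian repulsion potential that
            # penalizes target i overlapping the other targets' estimates.
            grad = np.zeros_like(m0)
            for j, mj in enumerate(means):
                if j != i:
                    d = means[i] - mj
                    grad += repulsion * d * np.exp(-0.5 * float(d @ d))
            # The interaction shifts the independent Kalman mean through
            # the posterior covariance, one fixed-point step at a time.
            new_means.append(m0 + P @ grad)
        means = new_means
    return means
```

With repulsion=0 the update simply returns the independent Kalman means, matching the property stated in the abstract that the equations give the Kalman solution when there is no interaction.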

Relevance:

30.00%

Publisher:

Abstract:

Digoxin is one of the most frequently prescribed drugs, particularly in the elderly population, in which there is an increased prevalence of atrial fibrillation and cardiac failure. The drug has a narrow therapeutic range and has gained a reputation for producing adverse effects in older patients. The more frail elderly patients with coexistent disease, often taking other treatments, are most at risk of digoxin toxicity due to inappropriate dosing, noncompliance, or increased sensitivity to digoxin resulting from pharmacokinetic or pharmacodynamic interactions. Application of basic pharmacological principles may help to anticipate these problems. Elderly patients receive digoxin more commonly than younger patients, which in part accounts for the higher rates of toxicity in this group. Numerous factors contribute to the development of toxicity, and its accurate diagnosis is difficult in this age group because the clinical manifestations are numerous, nonspecific, and often related to coexisting disease. This diagnostic imprecision is well recognised and has been helped by the introduction of serum digoxin measurement; however, reliance on serum concentrations should not replace clinical judgement, since they do not always correlate with toxicity. The apparently decreasing incidence of toxicity over recent years probably reflects several factors: improved digoxin formulations, awareness of digoxin pharmacology, use of serum concentrations, and the realisation that digoxin withdrawal is a viable proposition in elderly patients. Greater knowledge about the causes and prevention of digoxin toxicity should further reduce the morbidity and mortality arising from digoxin overdose, especially in the elderly population.

Relevance:

30.00%

Publisher:

Abstract:

Life science research aims to continuously improve the quality and standard of human life. One of the major challenges in this area is to maintain food safety and security. A number of image processing techniques have been used to investigate the quality of food products. In this paper, we propose a new algorithm to effectively segment connected grains so that each of them can be inspected in a later processing stage. One family of existing segmentation methods is based on the watershed transform, and it has shown promising results in practice. However, due to the over-segmentation issue, this technique performs poorly in various settings, such as inhomogeneous backgrounds and connected targets. To solve this problem, we present a combination of two classical techniques. In the first step, a mean shift filter is used to eliminate the inhomogeneous background, with entropy used as the convergence criterion. Secondly, a color gradient algorithm is used to detect the most significant edges, and a marker-controlled watershed transform is applied to segment the connected objects from the outputs of the previous stages. The proposed framework offers a good compromise among execution time, usability, efficiency and segmentation outcome in analyzing ring die pellets. The experimental results demonstrate that the proposed approach is effective and robust.
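
A minimal sketch of the described pipeline using standard OpenCV and scikit-image building blocks is given below. The parameter values are illustrative guesses, and OpenCV's default termination criteria stand in for the paper's entropy-based convergence test.

```python
import cv2
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def segment_grains(bgr):
    """Mean shift filtering + colour gradient + marker-controlled
    watershed (sketch; bgr must be an 8-bit 3-channel image)."""
    # 1) Mean shift filtering flattens the inhomogeneous background.
    smoothed = cv2.pyrMeanShiftFiltering(bgr, 15, 30)

    # 2) Colour gradient: per-channel Sobel magnitude, combined by max,
    #    so an edge present in any channel is preserved.
    grads = []
    for channel in cv2.split(smoothed):
        gx = cv2.Sobel(channel, cv2.CV_32F, 1, 0)
        gy = cv2.Sobel(channel, cv2.CV_32F, 0, 1)
        grads.append(cv2.magnitude(gx, gy))
    gradient = np.max(grads, axis=0)

    # 3) Markers: one seed per grain from distance-transform peaks of
    #    the Otsu foreground mask.
    gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY)
    _, fg = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    dist = cv2.distanceTransform(fg, cv2.DIST_L2, 5)
    markers, _ = ndi.label(dist > 0.5 * dist.max())

    # 4) Marker-controlled watershed on the colour gradient separates
    #    the connected grains.
    return watershed(gradient, markers, mask=fg.astype(bool))
```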

Relevance:

30.00%

Publisher:

Abstract:

Blood culture contamination (BCC) has been associated with unnecessary antibiotic use, additional laboratory tests and increased length of hospital stay, thus incurring significant extra hospital costs. We set out to assess the impact of a staff educational intervention programme on decreasing intensive care unit (ICU) BCC rates to <3% (American Society for Microbiology standard). BCC rates during the pre-intervention period (January 2006-May 2011) were compared with the intervention period (June 2011-December 2012) using run chart and regression analysis. Monthly ICU BCC rates during the intervention period were reduced to a mean of 3·7%, compared to 9·5% during the baseline period (P < 0·001), with estimated potential annual cost savings of about £250,100. The approach was simple in design, flexible in delivery and efficient in outcomes, which may encourage its translation into clinical practice in different healthcare settings.

Relevance:

30.00%

Publisher:

Abstract:

A new heuristic based on the Nawaz–Enscore–Ham (NEH) algorithm is proposed in this paper for solving the permutation flowshop scheduling problem. A new priority rule is proposed that accounts for the average, mean absolute deviation, skewness and kurtosis of the processing times, in order to fully describe their distribution. A new tie-breaking rule is also introduced to achieve effective job insertion with the objective of minimizing both makespan and machine idle time. Statistical tests illustrate the better solution quality of the proposed algorithm compared to existing benchmark heuristics.
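
The following is a compact sketch of the NEH insertion scheme with a moment-based priority rule in the spirit of the abstract. The weights of the four moments are illustrative (the paper's calibrated rule is not given here), and ties on makespan are broken by order of discovery rather than by the paper's idle-time-aware tie-breaking rule.

```python
import numpy as np
from scipy import stats

def makespan(seq, p):
    """Permutation flowshop makespan; p[j, k] is the processing time
    of job j on machine k."""
    c = np.zeros(p.shape[1])
    for j in seq:
        c[0] += p[j, 0]
        for k in range(1, p.shape[1]):
            c[k] = max(c[k], c[k - 1]) + p[j, k]
    return c[-1]

def neh(p, weights=(1.0, 1.0, 1.0, 1.0)):
    """NEH: rank jobs by the priority rule, then insert each job at
    the position that minimizes the partial makespan."""
    w1, w2, w3, w4 = weights
    mean = p.mean(axis=1)
    mad = np.abs(p - p.mean(axis=1, keepdims=True)).mean(axis=1)
    prio = (w1 * mean + w2 * mad
            + w3 * stats.skew(p, axis=1) + w4 * stats.kurtosis(p, axis=1))
    order = list(np.argsort(-prio))  # descending priority
    seq = [order[0]]
    for j in order[1:]:
        candidates = [seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)]
        seq = min(candidates, key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)
```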

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the calculation of fractional-order expressions through rational fractions. The article starts by analyzing the techniques adopted in continuous-to-discrete time conversion. The problem is then re-evaluated from an optimization perspective by taking advantage of the degree of freedom provided by the generalized mean formula. The results demonstrate the superior performance of the new algorithm.
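
The degree of freedom mentioned is the exponent p of the generalized (power) mean, defined for positive quantities as

```latex
M_p(x_1,\dots,x_n) \;=\; \left(\frac{1}{n}\sum_{i=1}^{n} x_i^{\,p}\right)^{1/p}
```

which recovers the harmonic (p = -1), geometric (p → 0) and arithmetic (p = 1) means as special cases. A plausible (assumed) reading of the abstract is that candidate continuous-to-discrete conversion operators are blended through M_p, so that choosing the rational approximation reduces to a one-parameter optimization over p.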

Relevance:

30.00%

Publisher:

Abstract:

The Container Loading Problem (CLP) literature has traditionally evaluated the dynamic stability of cargo by applying two metrics to box arrangements: the mean number of boxes supporting the items, excluding those placed directly on the floor (M1), and the percentage of boxes with insufficient lateral support (M2). However, these metrics, which aim to be proxies for cargo stability during transportation, fail to capture real-world conditions of dynamic stability. In this paper two new performance indicators are proposed to evaluate the dynamic stability of cargo arrangements: the number of fallen boxes (NFB) and the number of boxes within the Damage Boundary Curve fragility test (NB_DBC). Using 1500 solutions for well-known problem instances found in the literature, these new performance indicators are evaluated using a physics simulation tool (StableCargo), which replaces real-world transportation by truck with a simulation of the dynamic behaviour of container loading arrangements. Two new dynamic stability metrics that can be integrated within any container loading algorithm are also proposed. The metrics are analytical models of the proposed stability performance indicators, computed by multiple linear regression, with Pearson's r correlation coefficient used to evaluate model performance. The extensive computational results show that the proposed metrics are better proxies for dynamic stability in the CLP than the previously used metrics.
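
Since the analytical metrics are stated to be multiple linear regression models scored with Pearson's r, the computation reduces to a least-squares fit. The sketch below assumes hypothetical arrangement features as regressors; the paper's actual feature set is not listed in the abstract.

```python
import numpy as np

def fit_stability_metric(X, y):
    """Fit a linear model of a stability indicator (e.g. simulated NFB)
    from arrangement features X, and report Pearson's r between the
    simulated and predicted values."""
    A = np.column_stack([np.ones(len(X)), X])     # design matrix + intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares solution
    y_hat = A @ coef
    r = np.corrcoef(y, y_hat)[0, 1]               # Pearson's r
    return coef, r
```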

Relevance:

30.00%

Publisher:

Abstract:

In this article we study the effect of uncertainty on an entrepreneur who must choose the capacity of his business before knowing the demand for his product. The unit profit of operation is known with certainty, but there is no flexibility in our one-period framework. We show how the introduction of global uncertainty reduces the investment of the risk-neutral entrepreneur and, even more, that of the risk-averse one. We also show how marginal increases in risk reduce the optimal capacity of both the risk-neutral and the risk-averse entrepreneur, without any restriction on the concave utility function and with limited restrictions on the definition of a mean preserving spread. These general results are explained by the fact that the newsboy has a piecewise-linear, concave monetary payoff with a kink endogenously determined at the level of optimal capacity. Our results are compared with those in the two literatures on price uncertainty and demand uncertainty and, in particular, with the recent contributions of Eeckhoudt, Gollier and Schlesinger (1991, 1995).
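
A generic newsboy payoff consistent with this description is shown below; the symbols m (unit profit of operation) and c (unit capacity cost) are ours, not necessarily the paper's notation:

```latex
\pi(K, D) \;=\; m\,\min(K, D) - cK \;=\;
\begin{cases}
(m - c)\,K, & D \ge K,\\
m D - cK, & D < K,
\end{cases}
```

which is piecewise linear and concave in demand D, with the kink at the chosen capacity K.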

Relevance:

30.00%

Publisher:

Abstract:

Computational Biology is the research area that contributes to the analysis of biological data through the development of algorithms which address significant research problems. The data from molecular biology include DNA, RNA, protein and gene expression data. Gene expression data provide the expression levels of genes under different conditions. Gene expression is the process of transcribing the DNA sequence of a gene into mRNA sequences, which in turn are later translated into proteins; the number of copies of mRNA produced is called the expression level of a gene. Gene expression data are organized in the form of a matrix: rows represent genes, columns represent experimental conditions (different tissue types or time points), and the entries are real values. Through the analysis of gene expression data it is possible to determine the behavioural patterns of genes, such as the similarity of their behaviour, the nature of their interactions, and their respective contributions to the same pathways. Genes participating in the same biological process exhibit similar expression patterns. These patterns have immense relevance and application in bioinformatics and clinical research: in the medical domain they aid more accurate diagnosis, prognosis, treatment planning, drug discovery and protein network analysis.

To identify such patterns from gene expression data, data mining techniques are essential. Clustering is an important data mining technique for the analysis of gene expression data, and biclustering was introduced to overcome the problems associated with it. Biclustering refers to the simultaneous clustering of both rows and columns of a data matrix: clustering is a global model, whereas biclustering is a local one. Discovering local expression patterns is essential for identifying many genetic pathways that are not apparent otherwise, so it is necessary to move beyond the clustering paradigm towards approaches capable of discovering local patterns in gene expression data. A bicluster is a submatrix of the gene expression data matrix whose rows and columns need not be contiguous, and biclusters are not disjoint. Computing biclusters is costly because one would have to consider all combinations of rows and columns to find them all: the search space for the biclustering problem is 2^(m+n), where m and n are the numbers of genes and conditions respectively, and usually m+n exceeds 3000. The biclustering problem is NP-hard. Biclustering is nevertheless a powerful analytical tool for the biologist.

The research reported in this thesis addresses the problem of biclustering. Ten algorithms are developed for the identification of coherent biclusters from gene expression data. All of these algorithms make use of a measure called the mean squared residue to search for biclusters; the objective is to identify biclusters of maximum size with a mean squared residue below a given threshold.
All the algorithms begin the search from tightly coregulated submatrices called seeds, generated by the K-Means clustering algorithm. The algorithms developed can be classified as constraint-based, greedy, and metaheuristic. Constraint-based algorithms use one or more constraints, namely the MSR threshold and the MSR difference threshold. The greedy approach makes a locally optimal choice at each stage with the objective of finding the global optimum. In the metaheuristic approaches, Particle Swarm Optimization (PSO) and variants of the Greedy Randomized Adaptive Search Procedure (GRASP) are used for the identification of biclusters. These algorithms are implemented on the Yeast and Lymphoma datasets. Biologically relevant and statistically significant biclusters are identified by all of these algorithms and validated against the Gene Ontology database. All the algorithms are compared with other biclustering algorithms, and they overcome some of the problems associated with existing algorithms. With the help of some of the algorithms developed in this work, biclusters with very high row variance (higher than that of any other algorithm using the mean squared residue) are identified from both the Yeast and Lymphoma datasets; such biclusters, which capture significant changes in expression level, are highly relevant biologically.
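
All ten algorithms search using the mean squared residue, the coherence measure introduced by Cheng and Church; a minimal numpy implementation of it, together with the row variance used to rank the discovered biclusters, is:

```python
import numpy as np

def mean_squared_residue(A, rows, cols):
    """Cheng-Church mean squared residue of the bicluster defined by
    the index sets (rows, cols) of the expression matrix A; a lower
    MSR indicates a more coherent bicluster."""
    sub = A[np.ix_(rows, cols)]
    row_means = sub.mean(axis=1, keepdims=True)
    col_means = sub.mean(axis=0, keepdims=True)
    residue = sub - row_means - col_means + sub.mean()
    return float((residue ** 2).mean())

def row_variance(A, rows, cols):
    """Mean row variance of the bicluster; high values indicate
    biclusters with significant changes in expression level."""
    sub = A[np.ix_(rows, cols)]
    return float(((sub - sub.mean(axis=1, keepdims=True)) ** 2).mean())
```

The objective stated above is then: maximize the bicluster size |rows| x |cols| subject to mean_squared_residue(A, rows, cols) staying below the given threshold.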