972 results for Strictly hyperbolic polynomial


Relevance:

10.00%

Publisher:

Abstract:

In this paper, an enriched radial point interpolation method (e-RPIM) is developed for the determination of crack tip fields. In e-RPIM, the conventional RBF interpolation is augmented with suitable trigonometric basis functions to reflect the properties of the stresses at the crack tip. The performance of the enriched RBF meshfree shape functions is first investigated by fitting different surfaces. The surface fitting results show that, compared with the conventional RBF shape function, the enriched RBF shape function has: (1) similar accuracy in fitting a polynomial surface; (2) much better accuracy in fitting a trigonometric surface; and (3) similar interpolation stability, without an increase in the condition number of the RBF interpolation matrix. The enriched RBF shape function therefore not only retains all the advantages of the conventional RBF shape function, but also accurately reflects the properties of the stresses at the crack tip. The system of equations for the crack analysis is then derived from the enriched RBF meshfree shape function and the meshfree weak form. Several linear fracture mechanics problems are simulated using the newly developed e-RPIM. The results demonstrate that the present e-RPIM is accurate and stable, and has good potential as a practical simulation tool for fracture mechanics problems.
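A minimal NumPy sketch of the kind of enrichment described above: a standard RBF interpolant whose polynomial augmentation is extended with trigonometric terms. The multiquadric kernel, the 1-D setting, and the particular sin/cos terms are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def enriched_rbf_fit(points, values, c=1.0):
    """Fit an RBF interpolant whose augmentation basis includes
    trigonometric terms, solving the standard saddle-point system
        [Phi  P] [a]   [f]
        [P^T  0] [b] = [0].
    """
    n = len(points)
    r = np.abs(points[:, None] - points[None, :])
    Phi = np.sqrt(r**2 + c**2)  # multiquadric kernel (illustrative choice)
    # Augmentation: constant, linear, and trigonometric terms (illustrative).
    P = np.column_stack([np.ones(n), points, np.sin(points), np.cos(points)])
    m = P.shape[1]
    A = np.block([[Phi, P], [P.T, np.zeros((m, m))]])
    rhs = np.concatenate([values, np.zeros(m)])
    coef = np.linalg.solve(A, rhs)
    return coef[:n], coef[n:]

def enriched_rbf_eval(x, points, a, b, c=1.0):
    Phi = np.sqrt((x[:, None] - points[None, :])**2 + c**2)
    P = np.column_stack([np.ones(len(x)), x, np.sin(x), np.cos(x)])
    return Phi @ a + P @ b

# Fit a trigonometric "surface" (1-D for brevity) and check the error.
nodes = np.linspace(0.0, 2.0 * np.pi, 15)
a, b = enriched_rbf_fit(nodes, np.sin(3.0 * nodes))
xs = np.linspace(0.0, 2.0 * np.pi, 200)
print(np.max(np.abs(enriched_rbf_eval(xs, nodes, a, b) - np.sin(3.0 * xs))))
```

Because the sin/cos columns sit in the augmentation block, the interpolant reproduces such terms exactly without burdening the kernel matrix, which is consistent with the conditioning behaviour reported above.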

Relevance:

10.00%

Publisher:

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no "gold standard" test is currently available to assess tear film integrity. Improving techniques for the assessment of tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected from the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to scattering and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film. However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines was purposely developed to quantify the changes of the reflected pattern and to extract a time-series estimate of the TFSQ from the video recording. The routines extract a maximized area of analysis from each frame of the video recording, and a TFSQ metric is calculated over this area. Initially, two metrics, based on Gabor filtering and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear break-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; the DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified in this clinical study was a lack of sensitivity to quantify the build-up (formation) phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into a quasi-straight-line image from which a block statistic is extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model derived for the build-up phase provided some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, existing techniques for modeling tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods; a set of guidelines is proposed to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance over a broad range of conditions (i.e., contact lens wearers, normal and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
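A small sketch of the polar-transform-plus-block-processing idea described above. The nearest-neighbour resampling and the per-block standard deviation are stand-in assumptions; the thesis does not specify its block statistic here.

```python
import numpy as np

def to_polar(img, center, n_r=128, n_theta=256):
    """Resample a Placido-disk image from Cartesian to polar coordinates,
    so the concentric rings become quasi-straight horizontal lines
    (nearest-neighbour sampling, for brevity)."""
    cy, cx = center
    r_max = min(cy, cx, img.shape[0] - 1 - cy, img.shape[1] - 1 - cx)
    r = np.linspace(0.0, r_max, n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    R, T = np.meshgrid(r, theta, indexing="ij")
    ys = np.clip((cy + R * np.sin(T)).round().astype(int), 0, img.shape[0] - 1)
    xs = np.clip((cx + R * np.cos(T)).round().astype(int), 0, img.shape[1] - 1)
    return img[ys, xs]

def block_metric(polar_img, block=(16, 16)):
    """Toy TFSQ metric: mean per-block standard deviation of the unwrapped
    pattern; a disturbed tear film yields more irregular blocks."""
    h, w = polar_img.shape
    bh, bw = block
    trimmed = polar_img[:h - h % bh, :w - w % bw]
    blocks = trimmed.reshape(h // bh, bh, w // bw, bw)
    return float(blocks.std(axis=(1, 3)).mean())
```

Applying `block_metric(to_polar(frame, center))` to each video frame yields the kind of TFSQ time series the thesis analyses.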

Relevance:

10.00%

Publisher:

Abstract:

Although interest in assessing the relationship between temperature and mortality has grown due to climate change, relatively few data are available on the lag structure of the temperature-mortality relationship, particularly in the Southern Hemisphere. This study identified the lag effects of mean temperature on mortality among age groups and death categories using polynomial distributed lag models in Brisbane, Australia, a subtropical city, over 1996-2004. For a 1 °C increase above the threshold, the highest percentage increase in mortality on the current day occurred among people over 85 years (7.2% (95% CI: 4.3%, 10.2%)). The effect estimates for cardiovascular deaths were higher than those for all-cause mortality. For a 1 °C decrease below the threshold, the percentage increases in mortality at 21 lag days were 3.9% (95% CI: 1.9%, 6.0%) and 3.4% (95% CI: 0.9%, 6.0%) for people aged over 85 years and for cardiovascular deaths, respectively. These findings may have implications for developing intervention strategies to reduce and prevent temperature-related mortality.
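For readers unfamiliar with polynomial distributed lag models, here is a minimal sketch of the design-matrix construction (the classic Almon constraint); the lag length, polynomial degree, and function name are illustrative assumptions, not the study's exact specification.

```python
import numpy as np

def poly_dl_design(temps, max_lag=21, degree=3):
    """Build a polynomial distributed lag (Almon) design matrix: the lag
    coefficients beta_j are constrained to a degree-`degree` polynomial
    in j, so only degree + 1 parameters are estimated instead of
    max_lag + 1."""
    temps = np.asarray(temps, dtype=float)
    n = len(temps)
    # Column j holds the temperature series lagged by j days (NaN-padded;
    # drop the first max_lag rows before fitting).
    L = np.column_stack([
        np.concatenate([np.full(j, np.nan), temps[:n - j]])
        for j in range(max_lag + 1)
    ])
    # Constraint beta_j = sum_p c_p * j**p, i.e. beta = C @ c.
    C = np.vander(np.arange(max_lag + 1), degree + 1, increasing=True)
    return L @ C  # regress daily deaths on these degree+1 columns
```

Regressing daily death counts on these few columns (plus confounders) and mapping the fitted coefficients back through `C` recovers a smooth lag-response curve of the kind reported above.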

Relevance:

10.00%

Publisher:

Abstract:

Micro-finance, which includes micro-credit as one of its core services, has become an important component of a range of business models, from those that operate on a strictly economic basis to those that operate from a philanthropic base through non-government organisations (NGOs). Its success is often measured by the number of loans issued, their size, and the repayment rates. This paper has a dual purpose: to identify whether the models currently used to deliver micro-credit services to the poor are socially responsible, and to suggest a new delivery model that addresses some of the social responsibility issues while supporting community development. The proposed model is currently being implemented in Beira, the second-largest city in Mozambique. Mozambique exhibits many of the characteristics found in other African countries, so the model, if successful, may have implications for other poor African nations as well as other developing economies.

Relevance:

10.00%

Publisher:

Abstract:

Objective: To quantify the lagged effects of mean temperature on deaths from cardiovascular diseases in Brisbane, Australia. Design: Polynomial distributed lag models were used to assess the percentage increase in mortality up to 30 days associated with an increase (or decrease) of 1°C above (or below) the threshold temperature. Setting: Brisbane, Australia. Patients: 22 805 cardiovascular deaths registered between 1996 and 2004. Main outcome measures: Deaths from cardiovascular diseases. Results: The results show a longer lagged effect on cold days and a shorter lagged effect on hot days. For the hot effect, a statistically significant association was observed only for lags of 0-1 days. The percentage increase in mortality was 3.7% (95% CI 0.4% to 7.1%) for people aged ≥65 years and 3.5% (95% CI 0.4% to 6.7%) for all ages, associated with an increase of 1°C above the threshold temperature of 24°C. For the cold effect, a significant effect of temperature was found for lags of 10-15 days. The percentage estimates for older people and all ages were 3.1% (95% CI 0.7% to 5.7%) and 2.8% (95% CI 0.5% to 5.1%), respectively, with a decrease of 1°C below the threshold temperature of 24°C. Conclusions: The lagged effects lasted longer for cold temperatures but were shorter for hot temperatures. There was no substantial difference in the lag effect of temperature on mortality between all ages and those aged ≥65 years in Brisbane, Australia.
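A small sketch of the hot/cold threshold terms implied by the design above; the function name is an assumption, and a full analysis would combine these terms with a distributed lag basis like the one sketched earlier.

```python
import numpy as np

def threshold_terms(mean_temp, threshold=24.0):
    """Split daily mean temperature into excess-heat and excess-cold terms
    around the 24 C threshold used in the study."""
    hot = np.clip(mean_temp - threshold, 0.0, None)   # degrees above threshold
    cold = np.clip(threshold - mean_temp, 0.0, None)  # degrees below threshold
    return hot, cold

# In a log-linear (e.g. Poisson) mortality model, 100 * (exp(beta_hot) - 1)
# approximates the percentage increase in deaths per 1 C above threshold.
hot, cold = threshold_terms(np.array([21.5, 24.0, 27.3]))
print(hot, cold)
```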

Relevance:

10.00%

Publisher:

Abstract:

There is a widely held view that architecture is strongly, even primarily, determined by society and culture rather than geographical, technical, economic or climatic factors. This is especially evident in societies where rituals, customs and tradition play a significant role in the design of built forms. One such society was that of feudal Japan under the rule of the samurai warriors. The strictly controlled hierarchical society of feudal Japan, isolated from the rest of the world for over 250 years, was able to develop the art and architecture borrowed from the neighboring older cultures of China and Korea into what is now considered uniquely Japanese. One such architecture is the Sukiya-style tea house, where the ritual of the tea ceremony took place. This ritual was developed by the tea masters, who were Zen monks or merchants belonging to the lowest class of the hierarchical feudal society. The Sukiya style developed from the 14th to the 16th century and became an architectural space that negated all the rules imposed on commoners by the samurai rulers. The tea culture had a major influence on Japanese architecture, the concept of space and aesthetics. It extended into the design of Japanese gardens, clothing, the presentation of food, and manners in day-to-day life. The focus of this paper is the Japanese ritual of the tea ceremony, the architecture of the tea house it inspired, the society responsible for its creation, and the culture that promoted its popularity and its continuation into the 21st century.

Relevance:

10.00%

Publisher:

Abstract:

Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
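A hedged sketch of the variational transformation mentioned above, in the notation commonly used for margin-based losses (the symbols here are assumptions, not quoted from the abstract). For a surrogate loss $\phi$ and conditional class probability $\eta$, define

```latex
C_\eta(\alpha) = \eta\,\phi(\alpha) + (1-\eta)\,\phi(-\alpha), \qquad
H(\eta) = \inf_{\alpha \in \mathbb{R}} C_\eta(\alpha), \qquad
H^{-}(\eta) = \inf_{\alpha\,(2\eta - 1) \le 0} C_\eta(\alpha).
```

The transform $\psi$ is then the convex closure of $\tilde\psi(\theta) = H^{-}\!\big(\tfrac{1+\theta}{2}\big) - H\big(\tfrac{1+\theta}{2}\big)$, and the basic comparison bound takes the form

```latex
\psi\bigl( R(f) - R^{*} \bigr) \;\le\; R_\phi(f) - R_\phi^{*},
```

relating excess 0–1 risk to excess surrogate risk; for the hinge loss, for example, this transform is commonly reported as $\psi(\theta) = |\theta|$.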

Relevance:

10.00%

Publisher:

Abstract:

We consider complexity penalization methods for model selection. These methods aim to choose a model to optimally trade off estimation and approximation errors by minimizing the sum of an empirical risk term and a complexity penalty. It is well known that if we use a bound on the maximal deviation between empirical and true risks as a complexity penalty, then the risk of our choice is no more than the approximation error plus twice the complexity penalty. There are many cases, however, where complexity penalties like this give loose upper bounds on the estimation error. In particular, if we choose a function from a suitably simple convex function class with a strictly convex loss function, then the estimation error (the difference between the risk of the empirical risk minimizer and the minimal risk in the class) approaches zero at a faster rate than the maximal deviation between empirical and true risks. In this paper, we address the question of whether it is possible to design a complexity penalized model selection method for these situations. We show that, provided the sequence of models is ordered by inclusion, in these cases we can use tight upper bounds on estimation error as a complexity penalty. Surprisingly, this is the case even in situations when the difference between the empirical risk and true risk (and indeed the error of any estimate of the approximation error) decreases much more slowly than the complexity penalty. We give an oracle inequality showing that the resulting model selection method chooses a function with risk no more than the approximation error plus a constant times the complexity penalty.
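For concreteness, a minimal sketch of the generic shape of such a method, not the paper's specific penalty: given a nested sequence of models, pick the index minimizing empirical risk plus penalty. The numbers below are illustrative placeholders.

```python
import numpy as np

def select_model(emp_risks, penalties):
    """Complexity-penalized model selection over a nested model sequence.

    emp_risks[k] is the empirical risk of the empirical risk minimizer in
    model k; penalties[k] is the complexity penalty for model k (assumed
    increasing with k, matching the inclusion ordering of the models).
    Returns the index minimizing the penalized empirical risk."""
    scores = np.asarray(emp_risks) + np.asarray(penalties)
    return int(np.argmin(scores))

# Illustrative use: risks shrink with model size while penalties grow.
emp_risks = [0.30, 0.22, 0.19, 0.185, 0.184]
penalties = [0.01, 0.02, 0.04, 0.08, 0.16]
print("chosen model:", select_model(emp_risks, penalties))
```

The paper's point is about what `penalties` may legitimately be: under the inclusion ordering, tight bounds on estimation error suffice, even when they shrink faster than any estimate of the approximation error.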

Relevance:

10.00%

Publisher:

Abstract:

Background: The majority of peptide bonds in proteins occur in the trans conformation. For proline residues, however, a considerable fraction of prolyl peptide bonds adopt the cis form. Proline cis/trans isomerization is known to play a critical role in protein folding, splicing, cell signaling and transmembrane active transport. Accurate prediction of proline cis/trans isomerization in proteins would have many important applications for the understanding of protein structure and function. Results: In this paper, we propose a new approach to predict proline cis/trans isomerization in proteins using a support vector machine (SVM). Preliminary results indicated that radial basis function (RBF) kernels led to better prediction performance than polynomial and linear kernel functions. We used single-sequence information with different local window sizes, amino acid compositions of different local sequences, multiple sequence alignments obtained from PSI-BLAST, and secondary structure information predicted by PSIPRED, and explored these different sequence encoding schemes to investigate their effects on prediction performance. Training and testing were performed on a newly enlarged dataset of 2424 non-homologous proteins determined by X-ray diffraction, using 5-fold cross-validation. A window size of 11 provided the best performance for determining proline cis/trans isomerization from the single amino acid sequence. Using multiple sequence alignments in the form of PSI-BLAST profiles significantly improved prediction performance: accuracy increased from 62.8% with the single sequence to 69.8%, and the Matthews correlation coefficient (MCC) improved from 0.26 to 0.40. Furthermore, when coupled with the secondary structure predicted by PSIPRED, our method yielded a prediction accuracy of 71.5% and an MCC of 0.43, respectively about 9% and 0.17 higher than those achieved with single-sequence information alone. Conclusion: A new method has been developed to predict proline cis/trans isomerization in proteins based on a support vector machine, using the single amino acid sequence with different local window sizes, the amino acid compositions of local sequences flanking the central proline residue, the position-specific scoring matrices (PSSMs) extracted by PSI-BLAST and the secondary structures predicted by PSIPRED. The successful application of the SVM approach in this study reinforces that SVM is a powerful tool for predicting proline cis/trans isomerization and for biological sequence analysis.
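A minimal scikit-learn sketch of the core setup described: one-hot encoding of a fixed window around each proline and an RBF-kernel SVM. The two sequences and their labels are hypothetical placeholders; the study's actual pipeline adds composition features, PSSM profiles, predicted secondary structure, and 5-fold cross-validation.

```python
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {a: i for i, a in enumerate(AA)}

def encode_window(seq, center, window=11):
    """One-hot encode the residues in a window centred on a proline."""
    half = window // 2
    vec = np.zeros(window * len(AA))
    for k, pos in enumerate(range(center - half, center + half + 1)):
        if 0 <= pos < len(seq) and seq[pos] in AA_INDEX:
            vec[k * len(AA) + AA_INDEX[seq[pos]]] = 1.0
    return vec

# Hypothetical proline sites (sequence, proline index) with cis/trans labels.
sites = [("MKTAYIAKQPRQISFVKSHFSRQ", 9), ("GSHMPEFLKPVAQDINTAKAAGV", 4)]
X = np.array([encode_window(s, i) for s, i in sites])
y = np.array([1, 0])  # 1 = cis, 0 = trans (placeholder labels)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # RBF kernel, as in the study
clf.fit(X, y)
```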

Relevance:

10.00%

Publisher:

Abstract:

Land-change science emphasizes the intimate linkages between the human and environmental components of land management systems. Recent theoretical developments in drylands identify a small set of key principles that can guide the understanding of these linkages. Using these principles, a detailed study of seven major degradation episodes over the past century in Australia's grazed rangelands was reanalyzed, revealing a common sequence of events: (i) a period of good climatic and economic conditions, leading to local and regional social responses of increasing stocking rates and setting the preconditions for rapid environmental collapse; (ii) a major drought coupled with a fall in the market, making destocking financially unattractive and further exacerbating the pressure on the environment; and (iii) permanent or temporary declines in grazing productivity, depending on follow-up seasons coupled again with market and social conditions. The analysis supports recent theoretical developments but shows that establishing environmental knowledge that is strictly local may be insufficient on its own for sustainable management. Learning systems based in a wider community are needed that combine local knowledge, formal research, and institutional support. The analysis also illustrates how natural variability in the state of both ecological and social systems can interact to precipitate nonequilibrial change in each other, so that planning cannot be based only on average conditions. Indeed, it is this variability in both the environmental and social subsystems that hinders the local learning required to prevent collapse.

Relevance:

10.00%

Publisher:

Abstract:

We describe a model of computation of the parallel type, which we call 'computing with bio-agents', based on the concept that the motions of biological objects such as bacteria or protein molecular motors in confined spaces can be regarded as computations. We begin with the observation that the geometric nature of the physical structures in which model biological objects move modulates their motions. Consequently, by changing the geometry, one can control the characteristic trajectories of the objects; on this basis, we argue that such systems are computing devices. We investigate the computing power of mobile bio-agent systems and show that they are computationally universal in the sense that they are capable of computing any Boolean function in parallel. We also argue that, under appropriate conditions, bio-agent systems can solve NP-complete problems in probabilistic polynomial time.
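An entirely hypothetical toy illustration of the geometry-as-computation idea, not the paper's construction: input bits open or close channels, agents traverse open channels stochastically, and the presence of any agent at a shared outlet reads out a parallel OR.

```python
import random

def bio_agents_or(inputs, n_agents=100, p_traverse=0.9):
    """Toy parallel OR gate realized by channel geometry: each input bit
    set to 1 opens a channel that admits agents, and the output reads 1
    exactly when some agent traverses an open channel to the outlet."""
    for bit in inputs:
        if not bit:
            continue  # closed channel: geometry blocks the agents
        if any(random.random() < p_traverse for _ in range(n_agents)):
            return 1
    return 0

# With 100 agents per channel, the OR is correct with high probability.
print(bio_agents_or([0, 1, 0]), bio_agents_or([0, 0, 0]))
```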

Relevance:

10.00%

Publisher:

Abstract:

Resolving a noted open problem, we show that the Undirected Feedback Vertex Set problem, parameterized by the size of the solution set of vertices, is in the parameterized complexity class Poly(k); that is, polynomial-time pre-processing is sufficient to reduce an initial problem instance (G, k) to a decision-equivalent simplified instance (G', k') where k' ≤ k, and the number of vertices of G' is bounded by a polynomial function of k. Our main result shows an O(k^11) kernelization bound.
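For flavor, a sketch of two classic, much simpler reduction rules for Feedback Vertex Set using networkx; the paper's O(k^11) kernelization involves considerably more machinery, and this is only standard preprocessing of the same kind.

```python
import networkx as nx

def fvs_preprocess(G):
    """Two classic reduction rules for Undirected Feedback Vertex Set:
    Rule 1: delete any vertex of degree <= 1 (it lies on no cycle).
    Rule 2: bypass a degree-2 vertex, joining its two endpoints directly
            (parallel edges are kept: a 2-cycle still certifies a cycle).
    Both rules preserve the minimum feedback vertex set size."""
    H = nx.MultiGraph(G)
    changed = True
    while changed:
        changed = False
        for v in list(H.nodes):
            if v not in H:
                continue
            deg = H.degree(v)
            if deg <= 1:
                H.remove_node(v)
                changed = True
            elif deg == 2:
                ends = [u for _, u in H.edges(v)]
                if len(ends) == 2 and v not in ends:  # skip self-loops at v
                    H.remove_node(v)
                    H.add_edge(ends[0], ends[1])
                    changed = True
    return H
```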

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The relationship between temperature and mortality has been explored for decades, and many temperature indicators have been applied separately. However, few data are available on how the effects of different temperature indicators vary across mortality categories, particularly in a typical subtropical climate. OBJECTIVE: To assess the associations between various temperature indicators and different mortality categories in Brisbane, Australia, during 1996-2004. METHODS: We applied two methods to select the threshold and temperature indicator for each age group and death category: in the first, mean temperature and the threshold assessed from all-cause mortality were used for all mortality categories; in the second, the specific temperature indicator and threshold for each mortality category were identified separately by minimising the AIC. Using both methods, we fitted polynomial distributed lag non-linear models to estimate the change in mortality associated with a one-degree increase (or decrease) above (or below) the threshold, on the current day and at lags. RESULTS: Akaike's Information Criterion was minimised when mean temperature was used for all non-external deaths and deaths among those aged 75-84 years; when minimum temperature was used for deaths among those aged 0-64 years, 65-74 years and ≥85 years, and for deaths from respiratory diseases; and when maximum temperature was used for deaths from cardiovascular diseases. The effect estimates using the selected temperature indicators were similar to those using mean temperature, for both current-day and lagged effects. CONCLUSION: Different age groups and death categories were sensitive to different temperature indicators. However, the effect estimates from the selected temperature indicators did not differ significantly from those based on mean temperature.
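A minimal sketch of the AIC-based indicator selection described in the methods, using a bare Poisson regression in statsmodels; the function and variable names are illustrative assumptions, and a real model would include the lag basis, threshold terms, and season and trend controls alongside the temperature term.

```python
import numpy as np
import statsmodels.api as sm

def best_indicator(deaths, indicators):
    """Return the temperature indicator whose Poisson model minimises AIC.

    `indicators` maps a name ('mean', 'min', 'max') to its daily series."""
    aics = {}
    for name, temp in indicators.items():
        X = sm.add_constant(np.asarray(temp, dtype=float))
        fit = sm.GLM(np.asarray(deaths), X, family=sm.families.Poisson()).fit()
        aics[name] = fit.aic
    return min(aics, key=aics.get), aics
```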