156 results for Strictly Hyperbolic Polynomial


Relevance: 10.00%

Abstract:

Although interest in assessing the relationship between temperature and mortality has grown with climate change, relatively few data are available on the lag structure of the temperature-mortality relationship, particularly in the Southern Hemisphere. This study identified the lag effects of mean temperature on mortality among age groups and death categories using polynomial distributed lag models in Brisbane, Australia, a subtropical city, during 1996-2004. For a 1 °C increase above the threshold, the highest percent increase in mortality on the current day occurred among people over 85 years (7.2% (95% CI: 4.3%, 10.2%)). The effect estimates for cardiovascular deaths were higher than those for all-cause mortality. For a 1 °C decrease below the threshold, the percent increases in mortality at 21 lag days were 3.9% (95% CI: 1.9%, 6.0%) and 3.4% (95% CI: 0.9%, 6.0%) for people aged over 85 years and those with cardiovascular diseases, respectively. These findings may have implications for developing intervention strategies to reduce and prevent temperature-related mortality.
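The polynomial (Almon) distributed lag model used above constrains the per-lag coefficients to lie on a low-degree polynomial of the lag index, which is what makes long lag structures estimable. A minimal sketch on synthetic data (all variable names and numbers here are illustrative, not the study's):

```python
import numpy as np

# Almon / polynomial distributed lag sketch on synthetic data: the lag
# weights beta_l are constrained to a quadratic in the lag index l.
rng = np.random.default_rng(0)
n_obs, max_lag, degree = 500, 6, 2

temp = rng.normal(25.0, 4.0, n_obs + max_lag)      # synthetic temperature series
lags = np.arange(max_lag + 1)
true_beta = 0.5 - 0.1 * lags + 0.005 * lags**2     # decaying lag weights

# Lagged design matrix: X[t, l] = temp at time t - l
X = np.column_stack(
    [temp[max_lag - l : n_obs + max_lag - l] for l in lags]
)
y = X @ true_beta + rng.normal(0.0, 0.1, n_obs)

# beta_l = sum_j a_j * l**j  =>  X @ beta = (X @ P) @ a, so fit a by OLS
P = np.vander(lags, degree + 1, increasing=True)
a_hat, *_ = np.linalg.lstsq(X @ P, y, rcond=None)
beta_hat = P @ a_hat                               # recovered per-lag weights
```

Here `beta_hat` recovers the per-lag effect of the exposure; in the study the analogous coefficients are reported as percent increases in mortality per 1 °C beyond the threshold.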

Relevance: 10.00%

Abstract:

Micro-finance, which includes micro-credit as one of its core services, has become an important component of a range of business models – from those that operate on a strictly economic basis to those that come from a philanthropic base, through Non-Government Organisations (NGOs). Its success is often measured by the number of loans issued, their size, and the repayment rates. This paper has a dual purpose: to identify whether the models currently used to deliver micro-credit services to the poor are socially responsible, and to suggest a new model of delivery that addresses some of the social responsibility issues while supporting community development. The proposed model is currently being implemented in Beira, the second largest city in Mozambique. Mozambique exhibits many of the characteristics found in other African countries, so the model, if successful, may have implications for other poor African nations as well as other developing economies.

Relevance: 10.00%

Abstract:

Objective To quantify the lagged effects of mean temperature on deaths from cardiovascular diseases in Brisbane, Australia. Design Polynomial distributed lag models were used to assess the percentage increase in mortality up to 30 days associated with an increase (or decrease) of 1°C above (or below) the threshold temperature. Setting Brisbane, Australia. Patients 22 805 cardiovascular deaths registered between 1996 and 2004. Main outcome measures Deaths from cardiovascular diseases. Results The results show a longer lagged effect on cold days and a shorter lagged effect on hot days. For the hot effect, a statistically significant association was observed only for lag 0–1 days. The percentage increase in mortality was 3.7% (95% CI 0.4% to 7.1%) for people aged ≥65 years and 3.5% (95% CI 0.4% to 6.7%) for all ages, associated with an increase of 1°C above the threshold temperature of 24°C. For the cold effect, a significant effect of temperature was found for 10–15 lag days. The percentage estimates for older people and all ages were 3.1% (95% CI 0.7% to 5.7%) and 2.8% (95% CI 0.5% to 5.1%), respectively, with a decrease of 1°C below the threshold temperature of 24°C. Conclusions The lagged effects lasted longer for cold temperatures but were apparently shorter for hot temperatures. There was no substantial difference in the lag effect of temperature on mortality between all ages and those aged ≥65 years in Brisbane, Australia.

Relevance: 10.00%

Abstract:

There is a widely held view that architecture is determined primarily by society and culture rather than by geographical, technical, economic or climatic factors. This is especially evident in societies where rituals, customs and tradition play a significant role in the design of built forms. One such society was that of Feudal Japan under the rule of samurai warriors. The strictly controlled hierarchical society of Feudal Japan, isolated from the rest of the world for over 250 years, developed the art and architecture borrowed from the neighboring older cultures of China and Korea into what is now considered uniquely Japanese. One such architecture is the Sukiya-style tea house, where the ritual of the tea ceremony took place. This ritual was developed by the tea masters, who were Zen monks or merchants belonging to the lowest class of the hierarchical feudal society. The Sukiya style developed from the 14th to the 16th century and became an architectural space that negated all the rules imposed on commoners by the samurai rulers. The tea culture had a major influence on Japanese architecture, the concept of space and aesthetics. It extended into the design of Japanese gardens, clothes, the presentation of food, and manners in day-to-day life. The focus of this paper is the Japanese ritual of the tea ceremony, the architecture of the tea house it inspired, the society responsible for its creation and the culture that promoted its popularity and its continuation into the 21st century.

Relevance: 10.00%

Abstract:

Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
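The pointwise domination that underlies such surrogate risk bounds is easy to see for the SVM's hinge loss; the following sketch (my illustration, not the paper's construction) checks it numerically as a function of the margin m = y·f(x):

```python
import numpy as np

# Illustration: the SVM's hinge loss is a convex surrogate that dominates
# the 0-1 loss pointwise in the margin m = y * f(x), so the surrogate risk
# upper-bounds the classification risk.
margins = np.arange(-200, 201) / 100.0     # margins from -2.00 to 2.00

zero_one = (margins <= 0).astype(float)    # 0-1 loss as a function of the margin
hinge = np.maximum(0.0, 1.0 - margins)     # hinge surrogate: max(0, 1 - m)

dominates = bool(np.all(hinge >= zero_one))
gap_at_zero = float(hinge[200] - zero_one[200])   # both losses equal 1 at m = 0
```

The two losses touch at m = 0, and minimizing the hinge surrogate remains classification-calibrated, which is the pointwise Fisher consistency condition the paper identifies as the weakest sufficient assumption.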

Relevance: 10.00%

Abstract:

We consider complexity penalization methods for model selection. These methods aim to choose a model to optimally trade off estimation and approximation errors by minimizing the sum of an empirical risk term and a complexity penalty. It is well known that if we use a bound on the maximal deviation between empirical and true risks as a complexity penalty, then the risk of our choice is no more than the approximation error plus twice the complexity penalty. There are many cases, however, where complexity penalties like this give loose upper bounds on the estimation error. In particular, if we choose a function from a suitably simple convex function class with a strictly convex loss function, then the estimation error (the difference between the risk of the empirical risk minimizer and the minimal risk in the class) approaches zero at a faster rate than the maximal deviation between empirical and true risks. In this paper, we address the question of whether it is possible to design a complexity penalized model selection method for these situations. We show that, provided the sequence of models is ordered by inclusion, in these cases we can use tight upper bounds on estimation error as a complexity penalty. Surprisingly, this is the case even in situations when the difference between the empirical risk and true risk (and indeed the error of any estimate of the approximation error) decreases much more slowly than the complexity penalty. We give an oracle inequality showing that the resulting model selection method chooses a function with risk no more than the approximation error plus a constant times the complexity penalty.

Relevance: 10.00%

Abstract:

Background The majority of peptide bonds in proteins occur in the trans conformation. However, for proline residues, a considerable fraction of prolyl peptide bonds adopt the cis form. Proline cis/trans isomerization is known to play a critical role in protein folding, splicing, cell signaling and transmembrane active transport. Accurate prediction of proline cis/trans isomerization in proteins would have many important applications for the understanding of protein structure and function. Results In this paper, we propose a new approach to predict proline cis/trans isomerization in proteins using a support vector machine (SVM). Preliminary results indicated that Radial Basis Function (RBF) kernels led to better prediction performance than polynomial and linear kernel functions. We used single-sequence information with different local window sizes, amino acid compositions of different local sequences, multiple sequence alignments obtained from PSI-BLAST and the secondary structure information predicted by PSIPRED. We explored these different sequence encoding schemes to investigate their effects on prediction performance. Training and testing were performed on a newly enlarged dataset of 2424 non-homologous proteins determined by X-ray diffraction, using 5-fold cross-validation. A window size of 11 provided the best performance for determining proline cis/trans isomerization from the single amino acid sequence. Using multiple sequence alignments in the form of PSI-BLAST profiles significantly improved prediction performance: accuracy increased from 62.8% with the single sequence to 69.8%, and the Matthews Correlation Coefficient (MCC) improved from 0.26 to 0.40.
Furthermore, when coupled with the secondary structure predicted by PSIPRED, our method yielded a prediction accuracy of 71.5% and an MCC of 0.43, 9% and 0.17 higher, respectively, than those achieved with single-sequence information alone. Conclusion A new method has been developed to predict proline cis/trans isomerization in proteins based on a support vector machine, using the single amino acid sequence with different local window sizes, the amino acid compositions of local sequences flanking the central proline residue, the position-specific scoring matrices (PSSMs) extracted by PSI-BLAST and the secondary structures predicted by PSIPRED. The successful application of the SVM approach in this study reinforces that SVM is a powerful tool for predicting proline cis/trans isomerization and for biological sequence analysis more broadly.
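As an illustration of the encoding-plus-kernel idea (the window strings, gamma value and helper names below are hypothetical, not the paper's data or parameters), a local window centred on a proline can be one-hot encoded and compared with an RBF kernel:

```python
import numpy as np

# Hypothetical sketch: one-hot encode an 11-residue window around a proline
# and compare two windows with an RBF kernel, the kernel family the study
# found to perform best.
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def encode_window(window: str) -> np.ndarray:
    """One-hot encode a fixed-length amino-acid window into a flat vector."""
    vec = np.zeros((len(window), len(AMINO)))
    for i, aa in enumerate(window):
        vec[i, AMINO.index(aa)] = 1.0
    return vec.ravel()

def rbf_kernel(x: np.ndarray, y: np.ndarray, gamma: float = 0.1) -> float:
    """K(x, y) = exp(-gamma * ||x - y||^2)."""
    diff = x - y
    return float(np.exp(-gamma * diff @ diff))

w1 = encode_window("ACDEFPGHIKL")   # toy 11-residue window, P at the centre
w2 = encode_window("ACDEFPGHIKV")   # same window with one residue changed
k_same = rbf_kernel(w1, w1)         # identical windows give similarity 1
k_near = rbf_kernel(w1, w2)         # one substitution gives similarity < 1
```

An SVM trained on such vectors simply replaces inner products with this kernel; adding PSSM and predicted-secondary-structure features, as the paper does, extends each window vector with extra columns.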

Relevance: 10.00%

Abstract:

Land-change science emphasizes the intimate linkages between the human and environmental components of land management systems. Recent theoretical developments in drylands identify a small set of key principles that can guide the understanding of these linkages. Using these principles, a detailed study of seven major degradation episodes over the past century in Australian grazed rangelands was reanalyzed to show a common set of events: (i) good climatic and economic conditions for a period, leading to local and regional social responses of increasing stocking rates, setting the preconditions for rapid environmental collapse, followed by (ii) a major drought coupled with a fall in the market making destocking financially unattractive, further exacerbating the pressure on the environment; then (iii) permanent or temporary declines in grazing productivity, depending on follow-up seasons coupled again with market and social conditions. The analysis supports recent theoretical developments but shows that the establishment of environmental knowledge that is strictly local may be insufficient on its own for sustainable management. Learning systems based in a wider community are needed that combine local knowledge, formal research, and institutional support. It also illustrates how natural variability in the state of both ecological and social systems can interact to precipitate nonequilibrial change in each other, so that planning cannot be based only on average conditions. Indeed, it is this variability in both environment and social subsystems that hinders the local learning required to prevent collapse.

Relevance: 10.00%

Abstract:

We describe a model of computation of the parallel type, which we call 'computing with bio-agents', based on the concept that motions of biological objects such as bacteria or protein molecular motors in confined spaces can be regarded as computations. We begin with the observation that the geometric nature of the physical structures in which model biological objects move modulates the motions of the latter. Consequently, by changing the geometry, one can control the characteristic trajectories of the objects; on the basis of this, we argue that such systems are computing devices. We investigate the computing power of mobile bio-agent systems and show that they are computationally universal in the sense that they are capable of computing any Boolean function in parallel. We argue also that using appropriate conditions, bio-agent systems can solve NP-complete problems in probabilistic polynomial time.

Relevance: 10.00%

Abstract:

Resolving a noted open problem, we show that the Undirected Feedback Vertex Set problem, parameterized by the size of the solution set of vertices, is in the parameterized complexity class Poly(k); that is, polynomial-time pre-processing suffices to reduce an initial problem instance (G, k) to a decision-equivalent simplified instance (G', k') where k' ≤ k and the number of vertices of G' is bounded by a polynomial function of k. Our main result is an O(k^11) kernelization bound.

Relevance: 10.00%

Abstract:

BACKGROUND: The relationship between temperature and mortality has been explored for decades, and many temperature indicators have been applied separately. However, few data are available on how the effects of different temperature indicators vary across mortality categories, particularly in a typical subtropical climate. OBJECTIVE: To assess the associations between various temperature indicators and different mortality categories in Brisbane, Australia during 1996-2004. METHODS: We applied two methods to select the threshold and temperature indicator for each age group and death category: in the first, mean temperature and the threshold assessed from all-cause mortality were used for all mortality categories; in the second, a specific temperature indicator and threshold were identified separately for each mortality category by minimising the Akaike Information Criterion (AIC). We used a polynomial distributed lag non-linear model to estimate the change in mortality per 1°C of temperature increase (or decrease) above (or below) the threshold on the current day, as well as lagged effects, under both methods. RESULTS: The AIC was minimised when mean temperature was used for all non-external deaths and deaths among those aged 75-84 years; when minimum temperature was used for deaths among those aged 0-64 years, 65-74 years and ≥ 85 years, and for deaths from respiratory diseases; and when maximum temperature was used for deaths from cardiovascular diseases. The effect estimates using the selected temperature indicators were similar to those using mean temperature, for both current-day and lagged effects. CONCLUSION: Different age groups and death categories were sensitive to different temperature indicators. However, the effect estimates from the selected temperature indicators did not differ significantly from those based on mean temperature.
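The indicator-selection step can be sketched as fitting the same mortality outcome against each candidate temperature indicator and keeping the one with the smaller AIC. A toy version in Python (synthetic data; a straight-line model stands in for the distributed lag model):

```python
import numpy as np

# Toy AIC comparison between two candidate temperature indicators.
# In this synthetic example deaths are driven by minimum temperature,
# so its model should attain the lower AIC.
rng = np.random.default_rng(3)
n = 300
mean_temp = rng.normal(22, 4, n)                    # synthetic mean temperature
min_temp = mean_temp - rng.uniform(4, 8, n)         # synthetic minimum temperature

deaths = 50 - 0.8 * min_temp + rng.normal(0, 1.0, n)

def aic_linear(x, y):
    """AIC of a straight-line fit, up to an additive constant."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    k = X.shape[1]                                  # number of fitted parameters
    return len(y) * np.log(rss / len(y)) + 2 * k

aic_mean = aic_linear(mean_temp, deaths)
aic_min = aic_linear(min_temp, deaths)
best = "min" if aic_min < aic_mean else "mean"      # keep the lower-AIC indicator
```

The study applies the same principle with distributed lag non-linear models in place of the straight line, choosing the indicator and threshold per mortality category.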

Relevance: 10.00%

Abstract:

To date, research on P-O fit has focused heavily on the effect of P-O fit on individual and organisational outcomes. Few studies have attempted to explain how or why P-O fit leads to these outcomes. Meglino, Ravlin, and Adkins (1989) and Schein (1985) identified several intervening mechanisms for explaining fit-outcome relationships, but only a few of these explanations have been tested empirically (Cable & Edwards, 2004; Edwards & Cable, 2009; Kalliath, Bluedorn, & Strube, 1999). This thesis investigates role conflict, cognitive style and organisational justice as three potential mediating mechanisms in the relationship between P-O fit (defined as fit between personal and organisational values – value congruence or value fit) and outcomes including job satisfaction, job performance, service performance, affective commitment and continuance commitment. The study operationalised P-O fit using three measures: subjective fit, perceived fit and objective fit. The mediation model of subjective fit was tested in Mplus, while the mediation models of both perceived and objective fit were tested by modeling the difference between two scores (that is, between personal values and organisational values) using polynomial regression and response surface analysis (Edwards, 1993). A survey of 558 mid-level managers from seven Brunei public sector organisations provided the data. Our results showed that the relationship between P-O fit and outcomes was partially mediated by organisational justice and cognitive style for all three measures of fit, while role conflict had no mediating effect. The findings from this research therefore have both theoretical and practical implications. This research contributes to the literature by combining these theoretical explanations for value congruence effects into one integrated model, and by providing evidence on the partial mediating effects of organisational justice and cognitive style.
Future research needs to address and investigate other potential mechanisms by which value congruence affects individual and organisational outcomes. In addition, the study is considered to be the first to test these mediating roles for a value fit-outcomes relationship using three different measures of fit in a non-Western context.
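The polynomial regression and response surface approach of Edwards (1993) replaces a single difference score with both component values and their quadratic terms. A hedged sketch on synthetic data (variable names and the true surface are illustrative, not the thesis's data):

```python
import numpy as np

# Edwards-style polynomial regression: regress the outcome on personal
# values P, organisational values O, and their quadratic terms, instead
# of on the single difference score P - O.
rng = np.random.default_rng(1)
n = 400
P = rng.normal(0, 1, n)            # personal values
O = rng.normal(0, 1, n)            # organisational values

# True surface: outcome peaks where P and O agree (a congruence effect),
# i.e. y = 2 - (P - O)^2 = 2 - P^2 + 2*P*O - O^2, plus noise.
y = 2.0 - (P - O) ** 2 + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), P, O, P**2, P * O, O**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3, b4, b5 = coef      # fitted response-surface coefficients
```

A pure congruence effect shows up as b3 ≈ b5 ≈ -b4/2 with negligible linear terms; the response surface is then inspected along the P = O and P = -O lines rather than forcing a difference-score model.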

Relevance: 10.00%

Abstract:

Velocity jump processes are discrete random walk models that have many applications including the study of biological and ecological collective motion. In particular, velocity jump models are often used to represent a type of persistent motion, known as a “run and tumble”, which is exhibited by some isolated bacteria cells. All previous velocity jump processes are non-interacting, which means that crowding effects and agent-to-agent interactions are neglected. By neglecting these agent-to-agent interactions, traditional velocity jump models are only applicable to very dilute systems. Our work is motivated by the fact that many applications in cell biology, such as wound healing, cancer invasion and development, often involve tissues that are densely packed with cells where cell-to-cell contact and crowding effects can be important. To describe these kinds of high cell density problems using a velocity jump process we introduce three different classes of crowding interactions into a one-dimensional model. Simulation data and averaging arguments lead to a suite of continuum descriptions of the interacting velocity jump processes. We show that the resulting systems of hyperbolic partial differential equations predict the mean behavior of the stochastic simulations very well.
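A minimal, non-interacting version of the run-and-tumble process described above can be sketched as follows (parameters are illustrative; the paper's interacting models add crowding effects on top of this baseline):

```python
import numpy as np

# Non-interacting 1D run-and-tumble sketch: each agent moves at a fixed
# speed and reverses direction at random (Poisson) tumble events.
rng = np.random.default_rng(42)
n_agents, n_steps = 200, 1000
dt, speed, tumble_rate = 0.01, 1.0, 0.5

position = np.zeros(n_agents)
velocity = rng.choice([-speed, speed], n_agents)        # random initial direction

for _ in range(n_steps):
    position += velocity * dt                           # "run" phase
    tumble = rng.random(n_agents) < tumble_rate * dt    # tumble with prob. rate*dt
    velocity[tumble] *= -1                              # reverse tumbling agents

mean_disp = float(position.mean())                      # ~0 by symmetry
```

Over long times this process behaves diffusively with effective diffusivity v²/(2λ), which is why its averaged description is a hyperbolic (telegraph-type) partial differential equation; the crowded variants modify the right-hand sides of that system.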