62 results for Frame theory


Relevance: 20.00%

Abstract:

When Hume, in the Treatise of Human Nature, began his examination of the relation of cause and effect, and in particular of the idea of necessary connection which is its essential constituent, he identified two preliminary questions that should guide his research: (1) for what reason we pronounce it necessary that everything whose existence has a beginning should also have a cause, and (2) why we conclude that such particular causes must necessarily have such particular effects (1.3.2, 14-15). Hume observes that our belief in these principles can result neither from an intuitive grasp of their truth nor from reasoning that could establish them by demonstrative means. With respect to the first, in particular, Hume examines and rejects some arguments with which Locke, Hobbes and Clarke tried to demonstrate it, and suggests, by exclusion, that the belief we place in it can only come from experience. Somewhat surprisingly, however, Hume does not proceed to show how that derivation from experience could be made, but proposes instead to move directly to an examination of the second principle, saying that perhaps it will "be found in the end, that the same answer will serve for both questions" (1.3.3, 9). Hume's answer to the second question is well known, but the first question is never answered in the rest of the Treatise, and it is even doubtful that it could be, which would explain why Hume simply chose to remove any mention of it when he recompiled his theses on causation in the Enquiry concerning Human Understanding. Given this situation, an interesting question that naturally arises concerns the relations of logical or conceptual implication between these two principles. Hume seems to have thought that an answer to (2) would also be sufficient to provide an answer to (1). Henry Allison, for his part, argued (in Custom and Reason in Hume, pp. 94-97) that the two questions are logically independent. My proposal here is to show that there is indeed a logical dependency between them, but that the implication runs, rather, from (1) to (2). If accepted, this result may be particularly interesting for an interpretation of the scope of the so-called "Kant's reply to Hume" in the Second Analogy of Experience, which is structured as a proof of the a priori character of (1), but whose implications for (2) remain controversial.

Relevance: 20.00%

Abstract:

In this article I intend to show that certain aspects of A.N. Whitehead's philosophy of organism, and especially his epochal theory of time as mainly expounded in his well-known work Process and Reality, can serve to clarify the underlying assumptions that shape nonstandard mathematical theories as such, and also as metatheories of quantum mechanics. Concerning the latter issue, I point to an already significant body of research on nonstandard versions of quantum mechanics; two of these approaches are chosen to be critically presented in relation to the scope of this work. The main point of the paper is that, insofar as we can refer a nonstandard mathematical entity to a kind of axiomatic formalization essentially 'codifying' an underlying mental process indescribable as such by analytic means, we can possibly apply certain principles of Whitehead's metaphysical scheme, focused on the key notion of process, which is generally conceived as the becoming of actual entities. This is done in the sense of a unifying approach that provides an interpretation of nonstandard mathematical theories as such and also, in their metatheoretical status, as a formalization of the empirical-experimental context of quantum mechanics.

Relevance: 20.00%

Abstract:

The application of Extreme Value Theory (EVT) to model the probability of occurrence of extremely low Standardized Precipitation Index (SPI) values broadens our knowledge of the occurrence of extremely dry months. This sort of analysis can be carried out by means of two approaches: block maxima (BM; associated with the Generalized Extreme Value distribution) and peaks-over-threshold (POT; associated with the Generalized Pareto distribution). Each of these procedures has its own advantages and drawbacks. Thus, the main goal of this study is to compare the performance of BM and POT in characterizing the probability of occurrence of extreme dry SPI values obtained from the weather station of Ribeirão Preto-SP (1937-2012). According to the goodness-of-fit tests, both BM and POT can be used to assess the probability of occurrence of the aforementioned extreme dry monthly SPI values. However, the scalar measures of accuracy and the return-level plots indicate that POT provides the best-fitting distribution. The study also indicated that the uncertainties in the parameter estimates of a probabilistic model should be taken into account when the probability associated with a severe/extreme dry event is under analysis.
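
A minimal sketch of the two approaches, using a synthetic monthly SPI series in place of the Ribeirão Preto record and scipy's genextreme and genpareto as the GEV and Generalized Pareto fits; all numbers below are hypothetical, and the series is negated so that dry extremes become maxima:

# Sketch only: BM (GEV) and POT (Generalized Pareto) fits to synthetic
# monthly SPI data standing in for the 1937-2012 Ribeirão Preto record.
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(42)
spi = rng.normal(0.0, 1.0, size=76 * 12)      # synthetic monthly SPI series
dry = -spi                                    # negate so dry extremes become maxima

# Block maxima: one maximum per 12-month block, fitted with the GEV.
block_max = dry.reshape(-1, 12).max(axis=1)
c_bm, loc_bm, scale_bm = genextreme.fit(block_max)

# Peaks-over-threshold: excesses over a high threshold, fitted with the GPD.
u = np.quantile(dry, 0.95)
excesses = dry[dry > u] - u
c_pot, _, scale_pot = genpareto.fit(excesses, floc=0.0)

# 50-year return levels under each model (reported back in SPI units).
rl_bm = genextreme.ppf(1 - 1 / 50, c_bm, loc_bm, scale_bm)
lam = len(excesses) / 76                      # mean exceedances per year
rl_pot = u + genpareto.ppf(1 - 1 / (50 * lam), c_pot, 0.0, scale_pot)
print(f"BM  50-year dry SPI return level: {-rl_bm:.2f}")
print(f"POT 50-year dry SPI return level: {-rl_pot:.2f}")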

Relevance: 20.00%

Abstract:

An analytical study of the nonlinear vibrations of a portal frame foundation supporting multiple machines is presented. Two unbalanced rotating machines are considered, neither of them resonant with the lower natural frequencies of the supporting structure. Their combined frequencies are set in such a way as to excite, due to the nonlinear behavior of the frame, either the first anti-symmetric mode (sway) or the first symmetric mode. The physical and geometrical characteristics of the frame are chosen to tune the natural frequencies of these two modes into a 1:2 internal resonance. The problem is reduced to a two-degree-of-freedom model, and its nonlinear equations of motion are derived via a Lagrangian approach. Asymptotic perturbation solutions of these equations are obtained via the Multiple Scales Method.
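
A numerical sketch of the kind of behaviour involved, using a generic two-degree-of-freedom system with quadratic coupling tuned to a 1:2 internal resonance; the coefficients are hypothetical and are not those of the paper's frame model, which is solved asymptotically rather than numerically:

# Sketch only: a generic two-degree-of-freedom system with quadratic
# coupling tuned to a 1:2 internal resonance (omega2 = 2*omega1), with the
# second mode harmonically excited. All coefficients are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

omega1, omega2 = 1.0, 2.0      # natural frequencies in a 1:2 ratio
zeta1, zeta2 = 0.01, 0.01      # modal damping ratios
alpha = 0.2                    # quadratic coupling coefficient
F, Omega = 0.05, 2.0           # forcing of the second mode, near omega2

def rhs(t, y):
    q1, v1, q2, v2 = y
    a1 = -2 * zeta1 * omega1 * v1 - omega1**2 * q1 - alpha * q1 * q2
    a2 = (-2 * zeta2 * omega2 * v2 - omega2**2 * q2
          - 0.5 * alpha * q1**2 + F * np.cos(Omega * t))
    return [v1, a1, v2, a2]

sol = solve_ivp(rhs, (0, 1000), [0.001, 0.0, 0.0, 0.0], max_step=0.05)
tail = sol.t > 800
# Energy flowing into the indirectly excited mode is the signature of the
# 1:2 internal resonance exploited in the paper.
print(f"max |q1| for t > 800 (indirectly excited): {np.abs(sol.y[0, tail]).max():.3f}")
print(f"max |q2| for t > 800 (directly excited):   {np.abs(sol.y[2, tail]).max():.3f}")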

Relevance: 20.00%

Abstract:

In this paper, a systematic and quantitative view of the application of the theory of constraints in manufacturing is presented. This is done by employing the operational research technique of mathematical programming. The potential of the theory of constraints in automated manufacturing is demonstrated.
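
A toy illustration of the idea, assuming a hypothetical two-product, three-machine product-mix problem cast as a linear programme; the margins and machine times below are invented, and scipy's linprog does the optimization:

# Sketch only: a hypothetical two-product, three-machine product-mix
# problem solved as a linear programme, in the spirit of applying the
# theory of constraints via mathematical programming.
from scipy.optimize import linprog

# Maximise throughput 45*x1 + 60*x2 (hypothetical contribution margins);
# linprog minimises, so the objective is negated.
c = [-45, -60]

# Minutes required per unit on each machine, 2400 minutes available each.
A_ub = [
    [10, 10],   # machine A
    [15, 30],   # machine B  (the capacity-constraining resource here)
    [5, 5],     # machine C
]
b_ub = [2400, 2400, 2400]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal product mix:", res.x)
print("throughput         :", -res.fun)

# Zero slack identifies the binding constraint, i.e. the bottleneck that,
# under the theory of constraints, should govern scheduling decisions.
for name, slack in zip("ABC", res.slack):
    print(f"machine {name}: slack = {slack:.1f} min")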

Relevance: 20.00%

Abstract:

In this paper, the local dynamical behavior of a slewing flexible structure is analyzed, taking nonlinear curvature into account. The dynamics of the original (nonlinear) governing equations of motion are reduced to the center manifold in the neighborhood of an equilibrium solution in order to study the local stability of the system. At this critical point, a Hopf bifurcation occurs. In this region, one can find values of the control parameter (the structural damping coefficient) for which the system is unstable and values for which stability is assured (periodic motion). This local analysis of the system reduced to the center manifold determines the stable/unstable behavior of the original system around a known solution.
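
A minimal sketch of the phenomenon described, using the planar supercritical Hopf normal form with a parameter mu standing in for the control parameter; this is a generic illustration, not the paper's reduced centre-manifold equations:

# Sketch only: the planar supercritical Hopf normal form, with mu standing
# in for the control parameter (structural damping coefficient) of the
# reduced system. Not the paper's actual equations.
import numpy as np
from scipy.integrate import solve_ivp

def hopf(t, y, mu):
    x1, x2 = y
    r2 = x1**2 + x2**2
    # The origin is stable for mu < 0; for mu > 0 it loses stability and a
    # stable limit cycle of radius sqrt(mu) appears (periodic motion).
    return [mu * x1 - x2 - r2 * x1,
            x1 + mu * x2 - r2 * x2]

for mu in (-0.1, 0.1):
    sol = solve_ivp(hopf, (0, 200), [0.05, 0.0], args=(mu,), max_step=0.05)
    r_final = float(np.hypot(sol.y[0, -1], sol.y[1, -1]))
    expected = np.sqrt(mu) if mu > 0 else 0.0
    print(f"mu = {mu:+.1f}: final amplitude {r_final:.3f} (expected ~ {expected:.3f})")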

Relevance: 20.00%

Abstract:

We apply the Bogoliubov averaging method to the study of the vibrations of an elastic foundation forced by a non-ideal energy source. The model considered consists of a plane portal frame with quadratic nonlinearities and a 1:2 internal resonance, supporting a direct-current motor with limited power. The non-ideal excitation is in primary resonance, of the order of one-half, with the frequency of the second mode. The results of the averaging method, plotted as time-evolution curves and phase diagrams, are compared with those obtained by numerically integrating the original differential equations. The presence of the saturation phenomenon is verified by analytical procedures.
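
The averaging-versus-direct-integration comparison can be sketched on a textbook example (the van der Pol oscillator) rather than the paper's non-ideal portal-frame model; the first-order averaged amplitude equation is checked against direct numerical integration of the original equation:

# Sketch only: the averaging idea on a textbook example, not the paper's
# non-ideal portal-frame model. The amplitude predicted by the first-order
# averaged equation is compared with direct numerical integration.
import numpy as np
from scipy.integrate import solve_ivp

eps = 0.05

def vdp(t, y):                      # original equation: x'' + x = eps*(1 - x**2)*x'
    x, v = y
    return [v, -x + eps * (1.0 - x**2) * v]

def averaged(t, a):                 # averaged amplitude equation: a' = (eps/2)*a*(1 - a**2/4)
    return 0.5 * eps * a * (1.0 - a**2 / 4.0)

t_end = 400.0
direct = solve_ivp(vdp, (0, t_end), [0.2, 0.0], max_step=0.05)
avg = solve_ivp(averaged, (0, t_end), [0.2])

tail = direct.t > t_end - 50        # amplitude over the last few cycles
print(f"direct integration amplitude: {np.abs(direct.y[0, tail]).max():.3f}")
print(f"averaged-equation amplitude : {avg.y[0, -1]:.3f}  (limit-cycle theory: 2.0)")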

Relevance: 20.00%

Abstract:

Glyphosate is an herbicide that inhibits the enzyme 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS) (EC 2.5.1.19). EPSPS is the sixth enzyme of the shikimate pathway, by which plants synthesize the aromatic amino acids phenylalanine, tyrosine, and tryptophan, as well as many compounds used in secondary metabolism. About fifteen years ago it was hypothesized that weeds were unlikely to evolve resistance to this herbicide because of the limited degree of glyphosate metabolism observed in plants, the low resistance level attained through EPSPS gene overexpression, and the lower fitness of plants with an altered EPSPS enzyme. Today, however, 20 weed species have been described with glyphosate-resistant biotypes, found on five continents and exploiting several different resistance mechanisms. The survival and adaptation of these glyphosate-resistant weeds are related to resistance mechanisms that occur in plants selected under the intense selection pressure from repeated and exclusive use of glyphosate as the only control measure. In this paper, the physiological, biochemical, and genetic bases of glyphosate resistance mechanisms in weed species are reviewed, and a novel theory that integrates all the mechanisms of non-target-site glyphosate resistance in plants is presented.

Relevance: 20.00%

Abstract:

Organismic-centered Darwinism, in order to use phenotypes directly to measure the effect of natural selection, requires genomic harmony and uniform coherence, plus large population sizes. However, modern gene-centered Darwinism has found new interpretations of data that speak of genomic incoherence and disharmony. As a result of these two conflicting positions, a conceptual crisis has arisen in biology. My position is that the presence of small, even pocket-size, demes is instrumental in generating divergence and phenotypic crisis. Moreover, parasitic genomes, as in acanthocephalan worms, which even manipulate suicidal behavior in their hosts; segregation distorters that alter meiosis and Mendelian ratios; selfish genes and selfish whole chromosomes, such as the B-chromosomes of grasshoppers; P-elements in Drosophila; driving Y-chromosomes that manipulate sex ratios, making males more frequent, as in Hamilton's X-linked drive; male strategists; and outlaw genes are eloquent examples of genuinely conflicting genomes and of non-uniform phenotypic coherence and genomic harmony. Thus, we propose that overall incoherence and disharmony generate disorder, but also more biodiversity and creativity. Finally, if genes can manipulate natural selection, they can multiply mutations or undesirable characteristics, even lethal or detrimental ones, hence the accumulation of genetic loads. Outlaw genes can change what is adaptively convenient, even pushing a trait away from the optimum. The optimum can be "negotiated" among the variants, not only because pleiotropic effects demand it, but also, in some cases, because selfish genes, outlaw genes, P-elements, or extended phenotypic manipulation require it. Under organismic Darwinism, the genome in the population and in the individual was thought to act harmoniously, without conflicts, and genotypes were thought to march towards greater adaptability. Modern Darwinism has a gene-centered vision in which genes, as the objects of natural selection, can move in dissonance, in whatever direction benefits their multiplication. Thus, there are greater opportunities for genomes in permanent conflict.

Relevance: 20.00%

Abstract:

The present study compares the performance of stochastic and fuzzy models for the analysis of the relationship between clinical signs and diagnosis. Data obtained for 153 children concerning diagnosis (pneumonia, other non-pneumonia diseases, absence of disease) and seven clinical signs were divided into two samples, one for analysis and the other for validation. The former was used to derive relations by multiple discriminant analysis (MDA) and by fuzzy max-min composition (fuzzy), and the latter was used to assess the predictions drawn from each type of relation. MDA and fuzzy were closely similar in terms of prediction, correctly allocating 75.7% to 78.3% of the patients in the validation sample and displaying only a single instance of disagreement: a patient with a low level of toxemia was mistakenly classified as not diseased by MDA and correctly classified as ill by fuzzy. Concerning the relations themselves, each method provided different information, revealing different aspects of the relations between clinical signs and diagnoses. Both methods agreed in pointing to X-ray, dyspnea, and auscultation as the signs most strongly related to pneumonia, but only fuzzy was able to detect relations of heart rate, body temperature, toxemia, and respiratory rate with pneumonia. Moreover, only fuzzy was able to detect a relationship between heart rate and absence of disease, which allowed the detection of six malnourished children whose diagnoses as healthy are, indeed, disputable. The conclusion is that even though fuzzy set theory might not improve prediction, it certainly does enhance clinical knowledge, since it detects relationships not visible to stochastic models.
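
A small sketch of the fuzzy max-min composition step, using invented membership values and an invented sign-to-diagnosis relation (the paper derives its relation from the analysis sample; the numbers below are purely illustrative):

# Sketch only: fuzzy max-min composition of a patient's sign memberships
# with a sign-to-diagnosis fuzzy relation. All numbers are hypothetical.
import numpy as np

diagnoses = ["pneumonia", "other disease", "no disease"]

# Fuzzy relation R: membership of each sign in each diagnosis (invented).
R = np.array([
    [0.9, 0.3, 0.1],   # X-ray
    [0.8, 0.4, 0.1],   # dyspnea
    [0.8, 0.3, 0.1],   # auscultation
    [0.5, 0.4, 0.3],   # heart rate
    [0.4, 0.5, 0.2],   # body temperature
    [0.5, 0.6, 0.1],   # toxemia
    [0.6, 0.4, 0.2],   # respiratory rate
])

# One patient's sign memberships (invented), in the same sign order.
patient = np.array([0.7, 0.6, 0.8, 0.4, 0.5, 0.2, 0.6])

# Max-min composition: mu(d) = max over signs s of min(patient[s], R[s, d]).
membership = np.maximum.reduce(np.minimum(patient[:, None], R))
for d, m in zip(diagnoses, membership):
    print(f"{d:>13s}: {m:.2f}")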

Relevance: 20.00%

Abstract:

Coronary artery disease (CAD) is a leading cause of death worldwide. The standard method for evaluating critical partial occlusions is coronary arteriography, a catheterization technique that is invasive, time-consuming, and costly. There are noninvasive approaches for the early detection of CAD. The basis for the noninvasive diagnosis of CAD has been laid in a sequential analysis of risk factors and the results of the treadmill test and myocardial perfusion scintigraphy (MPS). Many investigators have demonstrated that the diagnostic applications of MPS are appropriate for patients who have an intermediate likelihood of disease. Although this information is useful, it is only partially exploited in clinical practice because of the difficulty of properly classifying patients. Since the seminal work of Lotfi Zadeh, fuzzy logic has been applied in numerous areas. In the present study, we proposed and tested a model, based on fuzzy set theory, to select patients for MPS. A group of 1053 patients was used to develop the model, and another group of 1045 patients was used to test it. Receiver operating characteristic curves were used to compare the performance of the fuzzy model against expert physician opinion, and showed that the performance of the fuzzy model was equal or superior to that of the physicians. Therefore, we conclude that the fuzzy model could be a useful tool to assist the general practitioner in the selection of patients for MPS.
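
A sketch of the ROC comparison only, on synthetic data; the fuzzy referral model itself is not reproduced here, and the scores and expert ratings below are invented purely to show the mechanics:

# Sketch only: ROC comparison of a continuous model score against a coarser
# expert rating, on synthetic data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1045                                   # size of the test group in the paper
truth = rng.integers(0, 2, size=n)         # 1 = patient should be referred to MPS

# Hypothetical continuous fuzzy-model output in [0, 1] and a coarser
# 4-point expert rating, both noisy around the true label.
fuzzy_score = np.clip(truth * 0.6 + rng.normal(0.2, 0.25, n), 0.0, 1.0)
expert_score = np.clip(np.round(truth * 2 + rng.normal(0.5, 0.9, n)), 0, 3)

print(f"AUC, fuzzy model   : {roc_auc_score(truth, fuzzy_score):.3f}")
print(f"AUC, expert opinion: {roc_auc_score(truth, expert_score):.3f}")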

Relevance: 20.00%

Abstract:

Since financial liberalization in the 1980s, non-profit-maximizing, stakeholder-oriented banks have outperformed private banks in Europe. This article draws on empirical research, banking theory and theories of the firm to explain this apparent anomaly for neo-liberal policy and contemporary market-based banking theory. The realization of competitive advantages by alternative banks (savings banks, cooperative banks and development banks) has significant implications for conceptions of bank change, regulation and political economy.

Relevance: 20.00%

Abstract:

This paper analyses the reasons for the instability of the world monetary system. The author considers this problem from historical and contemporary perspectives. According to the point of view presented here, banknotes and electronic money, which replaced gold and silver coins in general circulation, are the most important reason for the instability. Positive and negative consequences of monetary instability are also demonstrated. Reform of the world monetary system requires agreement within the global collective hegemony of state powers and transnational corporations.

Relevance: 20.00%

Abstract:

Textbook theory ignores capital flows: trade determines exchange rates and specialisation. Approaches that take the effects of capital movements adequately into account are needed, as is a new theory of economic policy that includes measures to protect the real economy from external volatility. Equilibrating textbook mechanisms cannot work unless trade-caused surpluses and deficits set exchange rates. To allow orthodox trade theory to work, one must prevent capital flows from destroying its very basis, which is what the IMF and wrong regulatory decisions have done, penalising production and trade. A new theory based on the real economy is proposed: a Neoclassical agenda of controlling capital flows and speculation.

Relevance: 20.00%

Abstract:

Many economists show a certain discomfort with the excessive mathematical formalization of economics. This stems from dissatisfaction with the old debate about the lack of correspondence between mainstream theoretical models and reality. Although we do not propose to settle this debate here, this article seeks to associate the mismatch between mathematized models and reality with the adoption of the hypothetical-deductive method, as reproduced by general equilibrium theory. We begin by outlining the main benefits of the mathematization of economics. Second, we address the traditional criticisms leveled against it. We then focus on more recent criticisms from Gillies (2005) and Bresser-Pereira (2008). Finally, we attempt to associate the reproduction of the hypothetical-deductive method with a metatheoretical process triggered by Debreu's general equilibrium theory. In this respect, we draw on the ideas of Weintraub (2002), Punzo (1991), and especially Woo (1986) to support our hypothesis.