63 results for transactional distance theory


Relevance: 20.00%

Abstract:

As a discipline, logic is arguably constituted of two main sub-projects: formal theories of argument validity on the basis of a small number of patterns, and theories of how to reduce the multiplicity of arguments in non-logical, informal contexts to the small number of patterns whose validity is systematically studied (i.e. theories of formalization). Regrettably, we now tend to view logic 'proper' exclusively as what falls under the first sub-project, to the neglect of the second, equally important sub-project. In this paper, I discuss two historical theories of argument formalization: Aristotle's syllogistic theory as presented in the "Prior Analytics", and medieval theories of supposition. They both illustrate this two-fold nature of logic, containing in particular illuminating reflections on how to formalize arguments (i.e. the second sub-project). In both cases, the formal methods employed differ from the usual modern technique of translating an argument in ordinary language into a specially designed symbolism, a formal language. The upshot is thus a plea for a broader conceptualization of what it means to formalize.
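
As an illustration of the "usual modern technique" the abstract contrasts with the older approaches, a syllogism in the mood Barbara ("All M are P; all S are M; therefore all S are P") would nowadays be formalized by translating it into a specially designed first-order language, roughly as follows (a minimal sketch, not drawn from the paper itself):

\[ \forall x\,(M(x) \rightarrow P(x)),\quad \forall x\,(S(x) \rightarrow M(x)) \;\vdash\; \forall x\,(S(x) \rightarrow P(x)) \]

Aristotle's syllogistic and medieval supposition theory, on the paper's account, formalize arguments without this detour through a dedicated symbolic language.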

Relevance: 20.00%

Abstract:

INTRODUCTION: Web-based e-learning is a teaching tool increasingly used in many medical schools and specialty fields, including ophthalmology. AIMS: This pilot study aimed to develop an internet-based course built around clinical cases and to evaluate the effectiveness of this method in a graduate medical education group. METHODS: This was an interventional randomized study. First, a website was built using a distance learning platform. Sixteen first-year ophthalmology residents were then randomized into two groups: an experimental group, which received the intervention (use of the e-learning site), and a control group, which did not. The students then answered a printed clinical case and their scores were compared. RESULTS: There was no statistically significant difference between the groups. CONCLUSION: We were able to successfully develop the e-learning site and the respective clinical cases. Although there was no statistically significant difference between the access and non-access groups, the study was a pioneering effort in our department, since an online clinical case program had never previously been developed.
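
A minimal sketch, in Python, of the kind of between-group comparison described above; the scores and the choice of a Mann-Whitney U test (reasonable for two groups of eight residents) are illustrative assumptions, not the study's actual data or analysis.

from scipy.stats import mannwhitneyu

# Hypothetical scores on the printed clinical case (two randomized groups of 8)
elearning_group = [7.0, 8.5, 6.0, 9.0, 7.5, 8.0, 6.5, 7.0]
control_group   = [6.5, 7.0, 8.0, 6.0, 7.5, 7.0, 8.5, 6.0]

stat, p_value = mannwhitneyu(elearning_group, control_group, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # p >= 0.05: no significant difference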

Relevance: 20.00%

Abstract:

This paper is a historical companion to a previous one, in which the so-called abstract Galois theory, as formulated by the Portuguese mathematician José Sebastião e Silva, was studied (see da Costa, Rodrigues (2007)). Our purpose is to present some applications of abstract Galois theory to higher-order model theory, to discuss Silva's notion of expressibility, and to outline a classical Galois theory that can be obtained inside the two versions of the abstract theory, those of Mark Krasner and of Silva. Some comments are made on the universal theory of (set-theoretic) structures.

Relevance: 20.00%

Abstract:

When Hume, in the Treatise of Human Nature, began his examination of the relation of cause and effect, and in particular of the idea of necessary connection which is its essential constituent, he identified two preliminary questions that should guide his research: (1) for what reason we pronounce it necessary that every thing whose existence has a beginning should also have a cause, and (2) why we conclude that such particular causes must necessarily have such particular effects (1.3.2, 14-15). Hume observes that our belief in these principles can result neither from an intuitive grasp of their truth nor from reasoning that could establish them by demonstrative means. In particular, with respect to the first, Hume examines and rejects some arguments with which Locke, Hobbes and Clarke tried to demonstrate it, and suggests, by exclusion, that the belief we place in it can only come from experience. Somewhat surprisingly, however, Hume does not proceed to show how that derivation from experience could be made, but proposes instead to move directly to an examination of the second principle, saying that "perhaps, [it will] be found in the end, that the same answer will serve for both questions" (1.3.3, 9). Hume's answer to the second question is well known, but the first question is never answered in the rest of the Treatise, and it is even doubtful that it could be, which would explain why Hume simply chose to remove any mention of it when he recompiled his theses on causation in the Enquiry concerning Human Understanding. Given this situation, an interesting question that naturally arises concerns the relations of logical or conceptual implication between these two principles. Hume seems to have thought that an answer to (2) would also be sufficient to provide an answer to (1). Henry Allison, for his part, argued (in Custom and Reason in Hume, pp. 94-97) that the two questions are logically independent. My proposal here is to try to show that there is indeed a logical dependency between them, but that the implication runs, rather, from (1) to (2). If accepted, this result may be particularly interesting for an interpretation of the scope of the so-called "Kant's reply to Hume" in the Second Analogy of Experience, which is structured as a proof of the a priori character of (1) but whose implications for (2) remain controversial.

Relevance: 20.00%

Abstract:

In this article I intend to show that certain aspects of A.N. Whitehead's philosophy of organism, and especially his epochal theory of time as mainly expounded in his well-known work Process and Reality, can serve to clarify the underlying assumptions that shape nonstandard mathematical theories as such, and also as metatheories of quantum mechanics. Concerning the latter issue, I point to an already significant body of research on nonstandard versions of quantum mechanics; two of these approaches are chosen to be critically presented in relation to the scope of this work. The main point of the paper is that, insofar as we can refer a nonstandard mathematical entity to a kind of axiomatic formalization essentially 'codifying' an underlying mental process indescribable as such by analytic means, we can possibly apply certain principles of Whitehead's metaphysical scheme, focused on the key notion of process, generally conceived as the becoming of actual entities. This is done in the sense of a unifying approach that provides an interpretation of nonstandard mathematical theories as such and also, in their metatheoretical status, as a formalization of the empirical-experimental context of quantum mechanics.

Relevance: 20.00%

Abstract:

Approximately 7.2% of the Atlantic rainforest remains in Brazil, with only 16% of this forest remaining in the State of Rio de Janeiro, all of it distributed in fragments. This forest fragmentation can produce biotic and abiotic differences between edges and the fragment interior. In this study, we compared the structure and richness of tree communities in three habitats - an anthropogenic edge (AE), a natural edge (NE) and the fragment interior (FI) - of a fragment of Atlantic forest in the State of Rio de Janeiro, Brazil (22°50'S and 42°28'W). One thousand and seventy-six trees with a diameter at breast height > 4.8 cm, belonging to 132 morphospecies and 39 families, were sampled in a total study area of 0.75 ha. NE had the greatest basal area and the trees in this habitat had the greatest diameter:height allometric coefficient, whereas AE had a lower richness and greater variation in the height of the first tree branch. Tree density, diameter, height and the proportion of standing dead trees did not differ among the habitats. There was marked heterogeneity among replicates within each habitat. These results indicate that the forest interior and the fragment edges (natural or anthropogenic) do not differ markedly with respect to the studied parameters. Other factors, such as edge age, matrix type, and proximity to gaps, may play a more important role in plant community structure than proximity to the edges.
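
As a rough sketch of the kind of computation involved (not the authors' code), basal area can be derived from the diameter at breast height and a per-replicate metric can be compared among the three habitats; all values below are hypothetical.

import math
from scipy.stats import kruskal

def basal_area_m2(dbh_cm):
    # Cross-sectional stem area (m^2) from diameter at breast height (cm)
    return math.pi * (dbh_cm / 100.0) ** 2 / 4.0

# Hypothetical per-replicate total basal areas (m^2) for each habitat
anthropogenic_edge = [0.9, 1.1, 0.8]
natural_edge       = [1.4, 1.6, 1.2]
forest_interior    = [1.0, 1.2, 0.9]

h_stat, p_value = kruskal(anthropogenic_edge, natural_edge, forest_interior)
print(f"inclusion threshold area: {basal_area_m2(4.8):.5f} m^2")
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")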

Relevance: 20.00%

Abstract:

The implementation of local geodetic networks for the georeferencing of rural properties has become a requirement since the publication of the Georeferencing Technical Standard by INCRA. According to this standard, the maximum baseline length for GNSS L1 receivers is 20 km. Besides baseline length, the geometry and the number of geodetic control stations are other factors to be considered in the implementation of geodetic networks. Thus, this research aimed to examine the influence of baseline lengths greater than the regulated limit of 20 km, of the geometry, and of the number of control stations on the quality of local geodetic networks for georeferencing, and also to demonstrate the importance of using specific tests to evaluate the ambiguity solution and the quality of the adjustment. The results indicated that increasing the number of control stations improved the quality of the network, that the geometry did not influence the quality, and that the baseline length did; however, lengths greater than 20 km did not prevent the establishment, with a GPS L1 receiver, of a local geodetic network suitable for georeferencing. Also, the use of different statistical tests, both for evaluating the resolution of ambiguities and for the adjustment, provided greater clarity in analyzing the results, allowing unsuitable observations to be eliminated.
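
The abstract emphasizes statistical testing of the adjustment. Below is a minimal sketch of a least-squares adjustment with a global chi-square test on the a posteriori variance factor; the one-dimensional model and the observations are illustrative assumptions, far simpler than a real GNSS baseline network.

import numpy as np
from scipy.stats import chi2

# Observations: height differences between stations (hypothetical values, metres):
# h_B - h_A, h_C - h_B, h_C - h_A, with h_A held fixed at 0 (the control station).
L = np.array([10.02, 5.05, 15.04])
sigma = 0.01                          # a priori std. dev. of each observation
A = np.array([[1.0, 0.0],             # design matrix for unknowns [h_B, h_C]
              [-1.0, 1.0],
              [0.0, 1.0]])
P = np.eye(3) / sigma**2              # weight matrix

x = np.linalg.solve(A.T @ P @ A, A.T @ P @ L)   # adjusted unknowns
v = A @ x - L                                    # residuals
dof = A.shape[0] - A.shape[1]                    # redundancy
s0_sq = (v @ P @ v) / dof                        # a posteriori variance factor

# Global test: is s0_sq compatible with the a priori unit variance?
test_stat = dof * s0_sq
alpha = 0.05
accepted = chi2.ppf(alpha / 2, dof) < test_stat < chi2.ppf(1 - alpha / 2, dof)
print(x, s0_sq, accepted)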

Relevance: 20.00%

Abstract:

The application of Extreme Value Theory (EVT) to model the probability of occurrence of extremely low Standardized Precipitation Index (SPI) values increases our knowledge of the occurrence of extreme dry months. This sort of analysis can be carried out by means of two approaches: block maxima (BM; associated with the Generalized Extreme Value distribution) and peaks-over-threshold (POT; associated with the Generalized Pareto distribution). Each of these procedures has its own advantages and drawbacks. Thus, the main goal of this study is to compare the performance of BM and POT in characterizing the probability of occurrence of extreme dry SPI values obtained from the weather station of Ribeirão Preto-SP (1937-2012). According to the goodness-of-fit tests, both BM and POT can be used to assess the probability of occurrence of the aforementioned extreme dry monthly SPI values. However, the scalar measures of accuracy and the return level plots indicate that POT provides the best-fitting distribution. The study also indicated that the uncertainties in the parameter estimates of a probabilistic model should be taken into account when the probability associated with a severe/extreme dry event is under analysis.
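
A hedged sketch of the two approaches compared in the study. The SPI series, the block size (annual), and the threshold choice are illustrative assumptions, not the Ribeirão Preto data; dry extremes (very negative SPI) are handled by negating the series so that they become maxima.

import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(0)
spi = rng.normal(0.0, 1.0, size=75 * 12)         # hypothetical monthly SPI, 75 years
x = -spi                                         # dry extremes become maxima

# Block maxima (BM): annual maxima of -SPI, fitted with the GEV distribution
annual_max = x.reshape(-1, 12).max(axis=1)
c_gev, loc_gev, scale_gev = genextreme.fit(annual_max)

# Peaks over threshold (POT): excesses above a high threshold, fitted with the GPD
u = np.quantile(x, 0.95)                         # illustrative threshold choice
excesses = x[x > u] - u
c_gpd, loc_gpd, scale_gpd = genpareto.fit(excesses, floc=0.0)

# 50-year return level (in -SPI units) under each model
p_year = 1.0 / 50.0
rl_bm = genextreme.ppf(1.0 - p_year, c_gev, loc_gev, scale_gev)
rate = len(excesses) / 75.0                      # mean exceedances per year
rl_pot = u + genpareto.ppf(1.0 - p_year / rate, c_gpd, 0.0, scale_gpd)
print(-rl_bm, -rl_pot)                           # back on the SPI scale (dry extremes)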

Relevance: 20.00%

Abstract:

In this paper, a systematic and quantitative view of the application of the theory of constraints in manufacturing is presented, employing the operational research technique of mathematical programming. The potential of the theory of constraints in automated manufacturing is demonstrated.
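
The paper's approach is described only in general terms, so the following is a hypothetical product-mix linear program, a standard way of casting the theory of constraints quantitatively, in which the binding capacity constraint identifies the bottleneck. All figures are made up.

from scipy.optimize import linprog

# Two products with throughput (selling price minus raw material cost) per unit
throughput = [45.0, 60.0]

# Minutes required on each machine per unit of product, and weekly capacity
A_ub = [[15.0, 10.0],   # machine A
        [10.0, 30.0]]   # machine B
capacity = [2400.0, 2400.0]

# linprog minimizes, so negate the throughput coefficients
res = linprog(c=[-t for t in throughput], A_ub=A_ub, b_ub=capacity,
              bounds=[(0, None), (0, None)], method="highs")

slack = res.slack          # zero slack marks the binding (bottleneck) resource
print("product mix:", res.x)
print("total throughput:", -res.fun)
print("bottleneck machines:", [i for i, s in enumerate(slack) if abs(s) < 1e-6])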

Relevance: 20.00%

Abstract:

In this paper, the local dynamical behavior of a slewing flexible structure with nonlinear curvature is analyzed. The dynamics of the original (nonlinear) governing equations of motion are reduced to the center manifold in the neighborhood of an equilibrium solution in order to study the local stability of the system. At this critical point, a Hopf bifurcation occurs. In this region, one can find values of the control parameter (the structural damping coefficient) for which the system is unstable and values for which its stability is assured (periodic motion). This local analysis of the system reduced to the center manifold establishes the stable/unstable behavior of the original system around a known solution.
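
The governing equations of the slewing structure are not reproduced in the abstract, so the sketch below uses a generic two-dimensional normal-form example simply to show how a Hopf point is located: the real part of a complex-conjugate eigenvalue pair of the Jacobian at the equilibrium changes sign as the control parameter (playing the role of an effective damping coefficient) is varied.

import numpy as np

omega = 1.0  # natural frequency of the linearized system (assumed)

def jacobian_at_origin(mu):
    # Jacobian of x' = mu*x - omega*y - x*(x^2+y^2), y' = omega*x + mu*y - y*(x^2+y^2)
    return np.array([[mu, -omega],
                     [omega, mu]])

for mu in np.linspace(-0.2, 0.2, 9):
    eigvals = np.linalg.eigvals(jacobian_at_origin(mu))
    stable = np.all(eigvals.real < 0)
    print(f"mu = {mu:+.2f}  Re(lambda) = {eigvals.real[0]:+.2f}  "
          f"{'Re < 0 (stable focus)' if stable else 'Re >= 0 (at/past Hopf)'}")
# The sign change of Re(lambda) at mu = 0 marks the Hopf bifurcation; on the
# center manifold, the normal form then decides whether the resulting periodic
# orbit is stable (supercritical) or unstable (subcritical).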

Relevance: 20.00%

Abstract:

Glyphosate is an herbicide that inhibits the enzyme 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS) (EC 2.5.1.19). EPSPS is the sixth enzyme of the shikimate pathway, by which plants synthesize the aromatic amino acids phenylalanine, tyrosine, and tryptophan, as well as many compounds used in secondary metabolism pathways. About fifteen years ago it was hypothesized that weeds were unlikely to evolve resistance to this herbicide because of the limited degree of glyphosate metabolism observed in plants, the low resistance level conferred by EPSPS gene overexpression, and the lower fitness of plants with an altered EPSPS enzyme. Today, however, 20 weed species have been described with glyphosate-resistant biotypes, found on five continents and exploiting several different resistance mechanisms. The survival and adaptation of these glyphosate-resistant weeds are related to resistance mechanisms selected through the intense selection pressure from repeated and exclusive use of glyphosate as the only control measure. In this paper the physiological, biochemical, and genetic bases of glyphosate resistance mechanisms in weed species are reviewed, and a novel theory that integrates all the mechanisms of non-target-site glyphosate resistance in plants is presented.

Relevance: 20.00%

Abstract:

Two experiments were carried out to evaluate the initial growth of Eucalyptus urograndis plants growing in coexistence with Urochloa decumbens and U. ruziziensis. In 100-L boxes, one plant of U. decumbens or U. ruziziensis grew in coexistence with one plant of E. urograndis clone C219H or H15, respectively, at distances of 0, 5, 10, 15, 20, 25, 30, 35, and 40 cm from the crop. After 30, 60, and 90 days (both clones), and 150 days (clone H15 only), growth characteristics were evaluated. Plants of both clones growing in weed-free conditions showed better growth and development than plants growing in weedy conditions, regardless of the distance, with greater plant height, stem diameter, stem dry mass, and leaf dry mass. Likewise, the number of branches, number of leaves, and leaf area of clone C219H were similarly affected. Urochloa ruziziensis reduced the dry mass accumulation of stems and leaves at rates of 0.06 and 0.32 g per plant, respectively, for each centimeter closer to the crop, while U. decumbens reduced them by 0.03 and 0.14 g per plant. The interference of U. decumbens and U. ruziziensis with E. urograndis is more intense when weeds grow at short distances from the crop.
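
The per-centimeter reduction rates quoted above are, in effect, slopes of dry mass against weed-crop distance; a minimal sketch of how such a rate could be estimated is given below, with entirely hypothetical data points.

from scipy.stats import linregress

distance_cm = [0, 5, 10, 15, 20, 25, 30, 35, 40]
leaf_dry_mass_g = [10.1, 11.8, 13.0, 15.2, 16.5, 18.0, 19.9, 21.3, 22.8]  # hypothetical

fit = linregress(distance_cm, leaf_dry_mass_g)
# fit.slope (g per cm) plays the role of the 0.32 g/cm rate reported for
# U. ruziziensis: each centimeter closer to the crop costs roughly fit.slope grams.
print(f"slope = {fit.slope:.2f} g/cm, r^2 = {fit.rvalue**2:.3f}")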

Relevance: 20.00%

Abstract:

Organismic-centered Darwinism, in order to use phenotypes directly to measure the effect of natural selection, requires genomic harmony and uniform coherence, as well as large population sizes. However, modern gene-centered Darwinism has found new interpretations of data that speak of genomic incoherence and disharmony. As a result of these two conflicting positions, a conceptual crisis has arisen in biology. My position is that the presence of small, even pocket-size, demes is instrumental in generating divergence and phenotypic crisis. Moreover, parasitic genomes, as in acanthocephalan worms, which even manipulate their hosts into suicidal behavior; segregation distorters that alter meiosis and Mendelian ratios; selfish genes and selfish whole chromosomes, such as the B-chromosomes of grasshoppers; P-elements in Drosophila; driving Y-chromosomes that manipulate sex ratios to make males more frequent, as in Hamilton's X-linked drive; and male strategists and outlaw genes are eloquent examples of genuinely conflicting genomes and of non-uniform phenotypic coherence and genome harmony. Thus, we propose that overall incoherence and disharmony generate disorder, but also more biodiversity and creativity. Finally, if genes can manipulate natural selection, they can multiply mutations or undesirable characteristics, even lethal or detrimental ones, hence the accumulation of genetic loads. Outlaw genes can change what is adaptively convenient, even pushing a trait away from the optimum. The optimum can be "negotiated" among the variants, not only because pleiotropic effects demand it, but also, in some cases, because selfish genes, outlaw genes, P-elements, or extended phenotypic manipulation require it. Under organismic Darwinism, the genome in the population and in the individual was thought to act harmoniously, without conflicts, and genotypes were thought to march towards greater adaptability. Modern Darwinism has a gene-centered vision in which genes, as the objects of natural selection, can move in dissonance, in the direction that benefits their own multiplication. Thus, there are greater opportunities for genomes in permanent conflict.

Relevance: 20.00%

Abstract:

The present study compares the performance of stochastic and fuzzy models for the analysis of the relationship between clinical signs and diagnosis. Data obtained for 153 children concerning diagnosis (pneumonia, other non-pneumonia diseases, absence of disease) and seven clinical signs were divided into two samples, one for analysis and the other for validation. The former was used to derive relations by multi-discriminant analysis (MDA) and by fuzzy max-min compositions (fuzzy), and the latter was used to assess the predictions drawn from each type of relation. MDA and fuzzy were closely similar in terms of prediction, correctly allocating 75.7 to 78.3% of the patients in the validation sample and disagreeing in only a single instance: a patient with a low level of toxemia was mistakenly classified as not diseased by MDA and correctly classified as somehow ill by fuzzy. Concerning the relations, each method provided different information, revealing different aspects of the relationships between clinical signs and diagnoses. Both methods agreed in pointing to X-ray, dyspnea, and auscultation as the signs best related to pneumonia, but only fuzzy was able to detect relations of heart rate, body temperature, toxemia, and respiratory rate with pneumonia. Moreover, only fuzzy was able to detect a relationship between heart rate and absence of disease, which allowed the detection of six malnourished children whose diagnoses as healthy are, indeed, disputable. The conclusion is that even though fuzzy set theory might not improve prediction, it certainly enhances clinical knowledge, since it detects relationships not visible to stochastic models.
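
A minimal sketch of a fuzzy max-min composition of the kind used to relate signs to diagnoses: the membership of each diagnosis is the maximum, over signs, of the minimum between the patient's degree for that sign and the sign-diagnosis relation degree. The relation values and sign list below are illustrative assumptions, not the study's relation.

import numpy as np

signs = ["x_ray", "dyspnea", "auscultation", "heart_rate"]
diagnoses = ["pneumonia", "other_disease", "no_disease"]

# Fuzzy relation R: rows = signs, columns = diagnoses (hypothetical degrees)
R = np.array([[0.9, 0.3, 0.1],
              [0.8, 0.4, 0.1],
              [0.7, 0.5, 0.2],
              [0.4, 0.3, 0.6]])

# One patient's degrees of presence of each sign (hypothetical)
patient = np.array([0.8, 0.6, 0.9, 0.2])

# Max-min composition: mu(d) = max over signs s of min(patient[s], R[s, d])
composition = np.max(np.minimum(patient[:, None], R), axis=0)
for d, mu in zip(diagnoses, composition):
    print(f"{d}: {mu:.2f}")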

Relevance: 20.00%

Abstract:

Coronary artery disease (CAD) is a leading cause of death worldwide. The standard method for evaluating critical partial occlusions is coronary arteriography, a catheterization technique that is invasive, time consuming, and costly. There are noninvasive approaches to the early detection of CAD. The basis for the noninvasive diagnosis of CAD has been laid in a sequential analysis of risk factors and of the results of the treadmill test and myocardial perfusion scintigraphy (MPS). Many investigators have demonstrated that the diagnostic applications of MPS are appropriate for patients who have an intermediate likelihood of disease. Although this information is useful, it is only partially utilized in clinical practice owing to the difficulty of properly classifying patients. Since the seminal work of Lotfi Zadeh, fuzzy logic has been applied in numerous areas. In the present study, we proposed and tested a model for selecting patients for MPS based on fuzzy set theory. A group of 1053 patients was used to develop the model and another group of 1045 patients was used to test it. Receiver operating characteristic curves were used to compare the performance of the fuzzy model against expert physician opinions and showed that the performance of the fuzzy model was equal or superior to that of the physicians. Therefore, we conclude that the fuzzy model could be a useful tool to assist the general practitioner in the selection of patients for MPS.
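
A hedged sketch of the ROC-based comparison described above: the continuous output of a (here synthetic) fuzzy model and a physician rating are each scored against the reference indication for MPS. None of the numbers reflect the actual cohorts or the actual fuzzy model.

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 200
outcome = rng.integers(0, 2, size=n)                    # 1 = MPS indicated (reference)

# Synthetic scores: both correlate with the outcome, with overlapping noise
fuzzy_score = 0.4 * outcome + 0.6 * rng.random(n)       # fuzzy model output in [0, 1]
physician_score = 0.3 * outcome + 0.7 * rng.random(n)   # physician's graded opinion

print("AUC fuzzy model:", round(roc_auc_score(outcome, fuzzy_score), 3))
print("AUC physicians :", round(roc_auc_score(outcome, physician_score), 3))
# A comparable or higher AUC for the fuzzy model is the pattern the study reports.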