94 results for A priori
Abstract:
Husserl left many unpublished drafts explaining (or trying to explain) his views on spatial representation and geometry, particularly those collected in the second part of Studien zur Arithmetik und Geometrie (Hua XXI), but no fully articulated work on the subject. In this paper, I put forward an interpretation of what those views might have been. Husserl, I claim, distinguished among different conceptions of space: the space of perception (constituted from sensorial data by intentionally motivated psychic functions), that of physical geometry (or idealized perceptual space), the space of the mathematical science of physical nature (in which science, not only raw perception, has a say), and the abstract spaces of mathematics (free creations of the mathematical mind), each with its own peculiar geometrical structure. Perceptual space is proto-Euclidean and the space of physical geometry Euclidean, but mathematical physics, Husserl allowed, may find it convenient to represent physical space with a non-Euclidean structure. Mathematical spaces, in their turn, can be endowed, he thinks, with any geometry mathematicians may find interesting. Many other related questions are addressed here, in particular those concerning the a priori or a posteriori character of the many geometric features of perceptual space (bearing in mind that there are at least two different notions of the a priori in Husserl, which we may call the conceptual and the transcendental a priori). I conclude with an overview of Weyl's ideas on the matter, since his philosophical conceptions are often traceable back to his former master, Husserl.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The scientific and philosophical production of Charles Sanders PEIRCE (1839-1914), demanding as a criterion for intellectual work and for the thinker's conduct of life an absolute rigor in the construction of concepts and strict experimental verification, had as a consequence the detachment of scientific and philosophical work from any apologetic function. The claim that all knowledge of the world of experience, and even that elaborated by mathematics, is intrinsically probable and fallible stood against any and all dogmatism, and even against the a priori of the Kantian tradition. Peirce's interest in evolutionary theory and the unshakable coherence of his philosophy and of his attitudes, as professor and researcher, met deep resistance in the academic and publishing circles of his time. At a moment of grave crisis in the American university, a result of the economic and political transformations brought about by the Civil War (1861-1865), Peirce's stance very probably contributed to his dismissal as a professor from Harvard and Johns Hopkins Universities, to the difficulty of publishing his writings, and to his total isolation in the last years of his life.
Abstract:
The aim of this article is to problematize the social function of the school in contemporary capitalist society. Historically charged with the transmission of knowledge, the school has, in recent decades, faced serious difficulty in fulfilling, at least in part, this social function. Students and teachers no longer identify with this institutional space, the former because they do not learn and the latter because they cannot manage to teach. The estrangement of the social agents in their relations at school causes senses and meanings to be lost in the pedagogical process. The motivations of both drift apart in the activity, emptying the actions and the attainment of goals that, a priori, should be shared. What lies at the root of this conflict? Has the school exhausted its social function, fading away as a space for the production and socialization of knowledge? Should the contribution of the sociological gaze to this problem constitute a challenge to the social sciences in general, and to the teaching of sociology in particular?
Abstract:
A comparative study of aggregation error bounds for the generalized transportation problem is presented. A priori and a posteriori error bounds were derived, and a computational study was performed to (a) test the correlation between the a priori, the a posteriori, and the actual error and (b) quantify the difference of the error bounds from the actual error. Based on the results, we conclude that calculating the a priori error bound is a useful strategy for selecting the appropriate aggregation level. The a posteriori error bound provides a good quantitative measure of the actual error.
Abstract:
Aggregation-disaggregation is used to reduce the analysis of a large generalized transportation problem to that of a smaller one. Bounds on the actual difference between the aggregated objective and the original optimal value are used to quantify the error due to aggregation and to estimate the quality of the aggregation. The bounds can be calculated either before optimization of the aggregated problem (a priori) or after (a posteriori). Both types of bounds are derived and numerically compared. A computational experiment was designed to (a) study the correlation between the bounds and the actual error and (b) quantify the difference of the error bounds from the actual error. The experiment shows a significant correlation between some a priori bounds, the a posteriori bounds, and the actual error. These preliminary results indicate that calculating the a priori error bound is a useful strategy for selecting the appropriate aggregation level, since the a priori bound varies in the same way as the actual error. After the aggregated problem has been selected and optimized, the a posteriori bound provides a good quantitative measure of the error due to aggregation.
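On a toy instance, the bracketing logic behind such bounds can be sketched as follows: pricing each aggregated demand group at its minimum cost makes the aggregated optimum a lower bound on the true optimum, while pricing it at the maximum cost yields an upper bound, so the gap between the two brackets the aggregation error. This is a Zipkin-style textbook illustration, not the exact bounds derived in the paper; the data and the use of SciPy's `linprog` are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def solve_transport(cost, supply, demand):
    """Solve a balanced transportation LP; returns the optimal cost."""
    m, n = cost.shape
    A_eq, b_eq = [], []
    for i in range(m):                       # each source ships its supply
        row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1.0
        A_eq.append(row); b_eq.append(supply[i])
    for j in range(n):                       # each destination gets its demand
        col = np.zeros(m * n); col[j::n] = 1.0
        A_eq.append(col); b_eq.append(demand[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
    return res.fun

cost = np.array([[4.0, 6.0, 8.0], [5.0, 7.0, 3.0]])
supply, demand = [30.0, 70.0], [20.0, 30.0, 50.0]
z_true = solve_transport(cost, supply, demand)         # original optimum

# Aggregate demand nodes 1 and 2 into a single node of demand 50.
lo_cost = np.array([[cost[i, :2].min(), cost[i, 2]] for i in range(2)])
hi_cost = np.array([[cost[i, :2].max(), cost[i, 2]] for i in range(2)])
z_lo = solve_transport(lo_cost, supply, [50.0, 50.0])  # a priori lower bound
z_hi = solve_transport(hi_cost, supply, [50.0, 50.0])  # upper bound
print(z_lo, z_true, z_hi)  # the gap z_hi - z_lo bounds the aggregation error
```

The a priori flavor of the bound is visible here: `z_lo` and `z_hi` are computed from aggregated problems only, before the original problem is ever solved.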
Abstract:
In the quark model of the nucleon, the Fermi statistics of the elementary constituents can influence significantly the properties of multinucleon bound systems. In the Skyrme model, on the other hand, the basic quanta are bosons, so that qualitatively different statistics effects can be expected a priori. In order to illustrate this point, we construct schematic one-dimensional quark and soliton models which yield fermionic nucleons with identical baryon densities. We then compare the baryon densities of a two-nucleon bound state in both models. Whereas in the quark model the Pauli principle for quarks leads to a depletion of the density in the central region of the nucleus, the soliton model predicts a slight increase of the density in that region, due to the bosonic statistics of the meson-field quanta.
Abstract:
This paper presents the principal results of a detailed study on the use of the Meaningful Fractal Fuzzy Dimension measure in the problem of adequately determining the topological dimension of the output space of a Self-Organizing Map (SOM). This fractal measure is conceived by combining fractal theory and fuzzy approximate reasoning. In this work, the measure was applied to the dataset in order to obtain a priori knowledge, which is used to support decision making about the SOM output space design. Several maps were designed with this approach, and their evaluations are discussed here.
Abstract:
In recent years, many researchers in the biomedical sciences have made successful use of mathematical models to study, in a quantitative way, a multitude of phenomena such as those found in disease dynamics, control of physiological systems, optimization of drug therapy, and the economics of preventive medicine, among many other applications. The availability of good dynamic models has provided means for simulation and for the design of novel control strategies in the context of biological events. This work concerns a particular model of HIV infection dynamics, which is used here for a comparative evaluation of treatment schemes for AIDS patients. The mathematical model adopted in this work was proposed by Nowak & Bangham (1996) and describes the dynamics of viral concentration in terms of its interaction with CD4 cells and the cytotoxic T lymphocytes, which are responsible for the defense of the organism. Two conceptually distinct drug therapy techniques are analyzed: open-loop treatment, where a fixed dosage is prescribed a priori, and closed-loop treatment, where the doses are adjusted according to results obtained by laboratory analysis. Simulation results show that the closed-loop scheme can achieve improved treatment quality in terms of reduction of the viral load and of the quantity of administered drugs, but with the inconvenience of requiring frequent and periodic laboratory analyses.
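A minimal sketch of the Nowak & Bangham dynamics under open-loop treatment, where a dose fixed a priori simply scales the infectivity, could look like the following. The parameter values and the 50% efficacy figure are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# States: x uninfected CD4 cells, y infected cells, v free virus, z CTLs.
# Illustrative parameters (assumed, not the paper's calibration).
LAM, D, BETA, A, P, K, U, C, B = 10.0, 0.1, 0.01, 0.5, 1.0, 10.0, 3.0, 0.1, 0.2

def rhs(t, s, drug):
    x, y, v, z = s
    beta = BETA * (1.0 - drug)            # open-loop dose scales infectivity
    return [LAM - D * x - beta * x * v,   # uninfected cell turnover
            beta * x * v - A * y - P * y * z,  # infection and CTL killing
            K * y - U * v,                # virus production and clearance
            C * y * z - B * z]            # CTL proliferation and decay

def simulate(drug, t_end=200.0):
    return solve_ivp(rhs, (0.0, t_end), [100.0, 0.0, 1.0, 1.0],
                     args=(drug,), t_eval=np.linspace(0.0, t_end, 400))

untreated = simulate(drug=0.0)
treated = simulate(drug=0.5)              # fixed 50% efficacy, set a priori
print(untreated.y[2, -1], treated.y[2, -1])  # compare final viral loads
```

A closed-loop scheme would instead re-run short horizons, adjusting `drug` from the simulated "laboratory" reading of `v` at each step.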
Abstract:
This paper proposes a methodology for edge detection in digital images using the Canny detector, combined with a priori focusing of the edge structure by nonlinear anisotropic diffusion via a partial differential equation (PDE). This strategy aims at minimizing the effect of the well-known duality of the Canny detector, under which it is not possible to simultaneously improve insensitivity to image noise and the localization precision of detected edges. Anisotropic diffusion via the PDE is used to focus the edge structure a priori, owing to its notable ability to smooth the image selectively, strongly smoothing homogeneous regions while largely preserving the physical edges, i.e., those actually related to objects present in the image. The solution to the mentioned duality consists in applying the Canny detector at a fine Gaussian scale, but only along the edge regions focused by the anisotropic diffusion process. The results have shown that the method is appropriate for applications involving automatic feature extraction, since it allows high-precision localization of thinned edges, which are usually related to objects present in the image. © Nauka/Interperiodica 2006.
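The selective-smoothing step can be sketched with the classic Perona-Malik scheme, one common form of anisotropic diffusion via a PDE; the paper's exact diffusion model and the parameter values below are assumptions.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=15.0, lam=0.2):
    """Anisotropic diffusion: the edge-stopping function g = exp(-(|du|/kappa)^2)
    lets flat regions diffuse freely while large gradients (edges) block it."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        # differences to the four neighbours (periodic boundary via roll)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Demo: a noisy step edge; homogeneous areas are smoothed, the step survives.
step = np.zeros((32, 32)); step[:, 16:] = 100.0
noisy = step + np.random.default_rng(0).normal(0.0, 5.0, step.shape)
smooth = perona_malik(noisy, n_iter=30, kappa=20.0)
```

In the paper's pipeline, a fine-scale Canny detector would then be applied only along the edge regions surviving in `smooth`, resolving the noise/localization trade-off.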
Abstract:
An important stage in the solution of active vibration control problems in flexible structures is the optimal placement of sensors and actuators. In many works, the positioning of these devices in distributed-parameter systems is based mainly on controllability approaches or performance criteria; the positions that maximize such measures are considered optimal. These techniques do not take into account the spatial variation of disturbances. One way to enhance the robustness of the control design is to locate the actuators considering the spatial distribution of the worst-case disturbance. This paper addresses the inclusion of external disturbance effects in the formulation of the optimal placement problem for sensors and piezoelectric actuators. The paper concludes with a numerical simulation on a truss structure, assuming that the disturbance is applied at a point known a priori. The system's C norm is used as the objective function, and an LQR (Linear Quadratic Regulator) controller is used to quantify the performance of the different sensor/actuator configurations.
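For the controller side of such a comparison, the LQR gain comes from a continuous-time algebraic Riccati equation; a single-mode sketch (the mass-spring model, the weights, and the actuator influence vector below are assumptions, not the paper's truss) might look like:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR: K = R^{-1} B^T P, with P from the Riccati equation."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

wn, zeta = 2.0, 0.02                        # one lightly damped structural mode
A = np.array([[0.0, 1.0], [-wn**2, -2.0 * zeta * wn]])
B = np.array([[0.0], [1.0]])                # actuator influence (placement enters here)
K = lqr_gain(A, B, np.diag([10.0, 1.0]), np.array([[0.1]]))
closed = A - B @ K
print(np.linalg.eigvals(closed).real)       # negative real parts: stabilized
```

Placement comparison amounts to repeating this with a different `B` for each candidate sensor/actuator configuration and ranking the resulting closed-loop performance.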
Abstract:
In practical situations, the dynamics of the forcing function acting on a vibrating system cannot be taken as given a priori; it must be treated as a consequence of the dynamics of the whole system. In other words, the forcing source has limited power, such as that provided by a DC motor, for example, and thus its own dynamics is influenced by that of the vibrating system being forced. This increases the number of degrees of freedom of the problem, which is then called a non-ideal problem. In this work, we consider two non-ideal problems analyzed by numerical simulation. The existence of the Sommerfeld effect was verified, that is, the effect of getting stuck at resonance (energy imparted to the DC motor being used to excite large-amplitude motions of the supporting structure). We considered two kinds of non-ideal problem: one related to the transverse vibrations of a shaft carrying two disks, and another to a piezoceramic bar transducer powered by a vacuum tube, a non-ideal source. Copyright © 2007 by ASME.
Abstract:
In this paper, some new constraints and an extended formulation are presented for a Lot Sizing and Scheduling Model proposed in the literature. In the production process considered, a key material is prepared and then transformed into different final items. The sequencing decisions concern the order in which the materials are processed, and the lot sizing decisions concern the production of the final items. The mathematical formulation considers sequence-dependent setup costs and times. Results of computational tests executed with the software Cplex 10.0 show that the performance of the branch-and-cut method can be improved by the proposed a priori reformulation.
Abstract:
Although association mining has been in the spotlight in recent years, the huge number of rules that are generated hampers its use. To overcome this problem, many post-processing approaches have been suggested, such as clustering, which organizes the rules into groups that contain, in some sense, similar knowledge. Nevertheless, clustering can aid the user only if good descriptors are associated with each group. This is a relevant issue, since the labels give the user a view of the topics to be explored, helping to guide the search. This is useful, for example, when the user does not have, a priori, an idea of where to start. Thus, the analysis of different labeling methods for association rule clustering is important. Considering these arguments, this paper analyzes some labeling methods through two proposed measures. One of them, Precision, measures how well the methods find labels that represent, as accurately as possible, the rules contained in each group; the other, Repetition Frequency, determines how the labels are distributed across the clusters. As a result, it was possible to identify the methods and the domain organizations with the best performance, which can be applied to clusters of association rules.
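One plausible reading of the two measures can be sketched as follows; the paper's exact definitions may differ, and the rule representation (item sets) and toy data are assumptions.

```python
from collections import Counter

def label_precision(cluster_rules, label):
    """Fraction of a cluster's rules that contain every item of the label
    (one plausible reading of the Precision measure)."""
    hits = sum(1 for rule in cluster_rules if set(label) <= set(rule))
    return hits / len(cluster_rules)

def repetition_frequency(labels):
    """Fraction of cluster labels that also occur on some other cluster
    (one plausible reading of Repetition Frequency)."""
    counts = Counter(labels)
    return sum(1 for lab in labels if counts[lab] > 1) / len(labels)

# Toy cluster of three association rules, each represented as an item set.
cluster = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter"}]
print(label_precision(cluster, {"bread"}))            # 2 of 3 rules match
print(repetition_frequency([("bread",), ("milk",), ("bread",)]))
```

A good labeling method would score high on Precision while keeping Repetition Frequency low, so each cluster receives a distinctive, representative label.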
Abstract:
Current evidence of phenological responses to recent climate change is substantially biased towards northern hemisphere temperate regions. Given regional differences in climate change, shifts in phenology will not be uniform across the globe, and conclusions drawn from temperate systems in the northern hemisphere might not be applicable to other regions on the planet. We conduct the largest meta-analysis to date of phenological drivers and trends among southern hemisphere species, assessing 1208 long-term datasets from 89 studies on 347 species. Data were mostly from Australasia (Australia and New Zealand), South America and the Antarctic/subantarctic, and focused primarily on plants and birds. This meta-analysis shows an advance in the timing of spring events (with a strong Australian data bias), although substantial differences in trends were apparent among taxonomic groups and regions. When only statistically significant trends were considered, 82% of terrestrial datasets and 42% of marine datasets demonstrated an advance in phenology. Temperature was most frequently identified as the primary driver of phenological changes; however, in many studies it was the only climate variable considered. When precipitation was examined, it often played a key role but, in contrast with temperature, the direction of phenological shifts in response to precipitation variation was difficult to predict a priori. We discuss how phenological information can inform the adaptive capacity of species, their resilience, and constraints on autonomous adaptation. We also highlight serious weaknesses in past and current data collection and analyses at large regional scales (with very few studies in the tropics or from Africa) and dramatic taxonomic biases. 
If accurate predictions regarding the general effects of climate change on the biology of organisms are to be made, data collection policies focussing on targeting data-deficient regions and taxa need to be financially and logistically supported. © 2013 Chambers et al.