994 results for Geometry concept


Relevance:

60.00%

Publisher:

Abstract:

This article presents the results of a study whose objective was to analyse the performance of high-school (Ensino Médio) students in identifying examples and non-examples of polygons and polyhedra, with respect to defining attributes and irrelevant attributes. A total of 253 students, distributed across the three grades of high school at a public school, took part in the study and completed a test of examples and non-examples. Six students were subsequently selected for interviews about their knowledge of the examples and non-examples. The results showed that the participants obtained a low mean score (M = 5.59) on the task requiring the identification of examples and non-examples of polygons and polyhedra. Analysis of the interviews showed that most students considered that the irrelevant attribute of the polygons analysed did not affect their answers. Overall, however, these students displayed conceptual knowledge far below what is expected at this level of schooling.

Relevance:

30.00%

Publisher:

Abstract:

During the late 20th century it was proposed that a design aesthetic reflecting current ecological concerns was required within the overall domain of the built environment and specifically within landscape design. To address this, some authors suggested various theoretical frameworks upon which such an aesthetic could be based. Within these frameworks there was an underlying theme that the patterns and processes of Nature may have the potential to form this aesthetic, an aesthetic based on fractal rather than Euclidean geometry. In order to understand how fractal geometry, described as the geometry of Nature, could become the referent for a design aesthetic, this research examines the mathematical concepts of fractal geometry and the philosophical concepts underlying the terms 'Nature' and 'aesthetics'. The findings of this initial research meant that a new definition of Nature was required in order to overcome the barrier presented by the Western philosophical Nature-culture duality. This new definition of Nature is based on the type and use of energy. Similarly, it became clear that current usage of the term aesthetics has more in common with the term 'style' than with its correct philosophical meaning. The aesthetic philosophy of both art and the environment recognises different aesthetic criteria related to either the subject or the object, such as aesthetic experience, aesthetic attitude, aesthetic value, aesthetic object, and aesthetic properties. Given these criteria, and the fact that the concept of aesthetics is still the subject of active and ongoing philosophical discussion, this work focuses on aesthetic properties and the aesthetic experience or response they engender. The examination of fractal geometry revealed that it is a geometry based on scale rather than on the location of a point within a three-dimensional space. This enables fractal geometry to describe the complex forms and patterns created through the processes of Wild Nature. Although fractal geometry has been used to analyse the patterns of built environments from a plan perspective, it became clear from the initial review of the literature that there was essentially no knowledge of the fractal properties of the environments people experience every day as they move through them. To address this, 21 different landscapes, ranging from highly developed city centres to relatively untouched landscapes of Wild Nature, have been analysed. Although this work shows that the fractal dimension can be used to differentiate between overall landscape forms, it also shows that by itself it cannot differentiate between all images analysed. To overcome this, two further parameters based on the underlying structural geometry embedded within the landscape are discussed. These parameters are the Power Spectrum Median Amplitude and the Level of Isotropy within the Fourier Power Spectrum. Based on the detailed analysis of these parameters, a greater understanding of the structural properties of landscapes has been gained. With this understanding, this research has moved the field of landscape design a step closer to being able to articulate a new aesthetic for ecological design.
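For readers unfamiliar with how a fractal dimension is estimated from landscape imagery, the sketch below shows a standard box-counting estimate on a binarised image. It is an illustrative assumption only: the thesis's actual analysis pipeline (including its Fourier power-spectrum parameters) is not reproduced here, and the function names, image source, and box sizes are hypothetical.

```python
# Minimal box-counting sketch for estimating a fractal dimension from a
# binarised (0/1) image, e.g. an edge map extracted from a landscape photograph.
# Illustrative only; the thesis's own method may differ in detail.
import numpy as np

def box_count(img: np.ndarray, box_size: int) -> int:
    """Count boxes of side `box_size` containing at least one non-zero pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if img[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def fractal_dimension(img: np.ndarray, sizes=(2, 4, 8, 16, 32, 64)) -> float:
    """Estimate D from the slope of log N(s) versus log(1/s)."""
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = (rng.random((256, 256)) > 0.7).astype(np.uint8)  # stand-in image
    print(f"estimated box-counting dimension: {fractal_dimension(demo):.2f}")
```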

Relevance:

30.00%

Publisher:

Abstract:

As conventional MOSFET scaling approaches the limit imposed by short-channel effects, Double Gate (DG) MOS transistors appear to be the most feasible candidate technology at sub-45 nm technology nodes. Because the short-channel effect in a DG transistor is controlled by the device geometry, an undoped or lightly doped body is used to sustain the channel. There is a disparity in the threshold-voltage calculation criteria for undoped-body symmetric double-gate transistors, which use two definitions: one potential-based and the other charge-based. In this paper, a novel concept of a "crossover point" is introduced, which shows that the charge-based definition is more accurate than the potential-based one. The change in threshold voltage with body-thickness variation for a fixed channel length is anomalous as predicted by the potential-based definition, while it is monotonic for the charge-based definition. The threshold voltage is then extracted from drain current versus gate voltage characteristics using linear extrapolation and the "Third Derivative of Drain-Source Current" (TD) method. The trend of threshold-voltage variation is found to be the same in both cases, which supports the charge-based definition.
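As an illustration of the kind of extraction described here, the sketch below estimates a threshold voltage from tabulated I_D versus V_G data by locating the peak of the third derivative of the drain current with respect to gate voltage. The synthetic data, smoothing choices, and function names are assumptions made for demonstration, not the paper's actual procedure or device data.

```python
# Hedged sketch of "third derivative of drain-source current" (TD) threshold
# extraction from I_D-V_G data. Synthetic data and numerical choices are
# illustrative assumptions, not the paper's procedure.
import numpy as np

def vt_third_derivative(vg: np.ndarray, id_: np.ndarray) -> float:
    """Return the gate voltage at which d^3(I_D)/dV_G^3 peaks."""
    d3 = np.gradient(np.gradient(np.gradient(id_, vg), vg), vg)
    return float(vg[np.argmax(d3)])

if __name__ == "__main__":
    # Synthetic I_D-V_G curve: exponential subthreshold region turning into a
    # roughly quadratic above-threshold region, with a placeholder V_T ~ 0.3 V.
    vg = np.linspace(0.0, 1.0, 501)
    vt_true, n_vt = 0.3, 0.060  # placeholder threshold and slope parameters
    id_ = 1e-7 * np.log1p(np.exp((vg - vt_true) / n_vt)) ** 2
    print(f"extracted V_T ~ {vt_third_derivative(vg, id_):.3f} V")
```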

Relevance:

30.00%

Publisher:

Abstract:

Sparking potentials in a coaxial-cylinder geometry in oxygen and dry air were measured in crossed electric and magnetic fields. From these data, effective collision frequencies were calculated using the equivalent pressure concept. It is shown that the equivalent pressure concept holds good for deriving effective collision frequencies in non-uniform electric fields.
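For context, one common statement of the equivalent pressure concept is given below as background. It is an assumption on my part that this is the form used in the paper; the abstract itself does not spell it out.

```latex
% Common form of the equivalent pressure concept for crossed E and B fields
% (background only; the specific form used in the paper is not given here).
% p     : actual gas pressure
% omega : electron cyclotron frequency, omega = eB/m_e
% nu    : electron collision frequency (the quantity being inferred)
\[
  p_{\mathrm{eq}} \;=\; p \,\sqrt{1 + \left(\frac{\omega}{\nu}\right)^{2}},
  \qquad \omega = \frac{eB}{m_e}.
\]
% Comparing sparking potentials measured with and without the magnetic field,
% interpreted through p_eq, then allows an effective collision frequency nu
% to be extracted.
```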

Relevance:

30.00%

Publisher:

Abstract:

Extended x-ray absorption fine-structure studies have been performed at the Zn K and Cd K edges for a series of solid solutions of wurtzite Zn1-xCdxS samples with x = 0.0, 0.1, 0.25, 0.5, 0.75, and 1.0, where the lattice parameter as a function of x evolves according to the well-known Vegard's law. In conjunction with extensive, large-scale first-principles electronic structure calculations with full geometry optimizations, these results establish that the percentage variation in the nearest-neighbor bond distances is smaller by nearly an order of magnitude than what would be expected on the basis of the lattice-parameter variation, seriously undermining the chemical pressure concept. With experimental results that allow us to probe distances up to the third coordination shell, we provide a direct description of how the local structure, apparently inconsistent with the global structure, evolves very rapidly with interatomic distance to become consistent with it. We show that the basic features of this structural evolution with composition can be visualized with nearly invariant ZnS4 and CdS4 tetrahedral units retaining their structural integrity, while the tilts between these tetrahedral building blocks change with composition to conform to the changing lattice parameters, in accordance with Vegard's law, within a relatively short length scale. These results underline the limits of applicability of the chemical pressure concept, which has been a favored tool of experimentalists for controlling the physical properties of a large variety of condensed-matter systems.
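To make the comparison in this abstract concrete, the sketch below contrasts the bond-length variation implied by a virtual-crystal (Vegard) picture with the composition-independent bond lengths of rigid ZnS4/CdS4 tetrahedra. The lattice constants and the ideal-wurtzite bond-length relation used here are approximate textbook values taken as assumptions, not data from the paper.

```python
# Sketch: Vegard's-law (virtual crystal) bond length vs. the rigid-tetrahedron
# picture for wurtzite Zn(1-x)Cd(x)S. Lattice constants are approximate
# literature values and the ideal-wurtzite relation d = sqrt(3/8)*a is assumed;
# both are illustrative stand-ins, not values from the paper.
import numpy as np

A_ZNS, A_CDS = 3.81, 4.14   # approximate wurtzite a-parameters (angstrom)

def bond(a: float) -> float:
    """Ideal-wurtzite nearest-neighbour distance for lattice parameter a."""
    return np.sqrt(3.0 / 8.0) * a

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    a_vegard = (1 - x) * A_ZNS + x * A_CDS   # Vegard's law interpolation
    d_virtual = bond(a_vegard)               # what a virtual crystal would imply
    print(f"x={x:4.2f}  a={a_vegard:5.3f} A  virtual-crystal d={d_virtual:5.3f} A  "
          f"rigid Zn-S={bond(A_ZNS):5.3f} A, Cd-S={bond(A_CDS):5.3f} A")
```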

Relevance:

30.00%

Publisher:

Abstract:

The tectogene, or crustal downbuckle, was proposed in the early 1930s by F.A. Vening Meinesz to explain the unexpected belts of negative gravity anomalies in island arcs. He attributed the isostatic imbalance to a deep sialic root resulting from the action of subcrustal convection currents. Vening Meinesz's model was initially corroborated experimentally by P.H. Kuenen, but additional experiments by D.T. Griggs and geological analysis by H.H. Hess in the late 1930s led to substantial revision in detail. As modified, the tectogene provided a plausible model for the evolution of island arcs into alpine mountain belts for another two decades. Additional revisions became necessary in the early 1950s to accommodate the unexpected absence of sialic crust in the Caribbean and the marginal seas of the western Pacific. By 1960 the cherished analogy between island arcs and alpine mountain belts had collapsed under the weight of the detailed field investigations by Hess and his students in the Caribbean region. Hess then incorporated a highly modified form of the tectogene into his sea-floor spreading hypothesis. Ironically, this final incarnation of the concept preserved some of the weaker aspects of the 1930s original, such as the ad hoc explanation for the regular geometry of island arcs.

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the difficulties of model testing deepwater structures at reasonable scales. An overview of recent research efforts to tackle this challenge is given first, introducing the concept of line truncation. Passive truncation has traditionally been the preferred method in industry; however, these techniques tend to struggle to capture line dynamic response accurately and hence to reproduce peak tensions. In an attempt to improve the credibility of model test data, the proposed truncation procedure sets up the truncated model based on line dynamic response rather than quasi-static system stiffness. Vibration decay of transverse elastic waves due to fluid drag forces is assessed, and it is found that below a certain length criterion the transverse vibrational characteristics of each line are inertia driven; hence, with respect to these motions, the truncated model can assume a linear damper whose coefficient depends on the local line properties and the vibration frequency. Initially a simplified taut-string model is assumed, for which the line is submerged in still water with one end fixed at the bottom and the other assumed to follow the vessel response, which can be harmonic or random. A dimensional analysis, supported by exact benchmark numerical solutions, has shown that it is possible to produce a general guideline for the truncation length criterion that is suitable for any kind of line with any top motion. The focus of this paper is to extend this work to the more complex line configuration of a conventional deepwater mooring line and so enhance the generality of the truncation guideline. The paper closes with an example case study of a spread mooring system, applying this method to create an equivalent numerical model at a reduced depth that replicates exactly the static and dynamic characteristics of the full-depth system. Copyright © 2012 by ASME.
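As background for the "linear damper" idea mentioned above, the sketch below computes an equivalent linear damping coefficient per unit length by the standard energy-equivalent linearisation of quadratic Morison-type drag for harmonic transverse motion. The use of this particular linearisation and all numerical values are assumptions for illustration, not the paper's derivation or data.

```python
# Equivalent linear damping per unit length for a line segment in transverse
# harmonic motion, from energy-equivalent linearisation of quadratic drag.
# All numbers are illustrative assumptions, not values from the paper.
import math

def equivalent_linear_damping(rho: float, cd: float, diameter: float,
                              amplitude: float, omega: float) -> float:
    """c_eq such that c_eq*v dissipates the same energy per cycle as
    0.5*rho*cd*D*|v|*v for v = amplitude*omega*cos(omega*t)."""
    v_peak = amplitude * omega
    return (4.0 / (3.0 * math.pi)) * rho * cd * diameter * v_peak

if __name__ == "__main__":
    c_eq = equivalent_linear_damping(rho=1025.0,      # sea water density, kg/m^3
                                     cd=1.2,          # assumed drag coefficient
                                     diameter=0.15,   # assumed line diameter, m
                                     amplitude=0.5,   # transverse amplitude, m
                                     omega=2 * math.pi / 10.0)  # 10 s period
    print(f"equivalent linear damping: {c_eq:.1f} N.s/m per metre of line")
```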

Relevance:

30.00%

Publisher:

Abstract:

This study relates tidal channel cross-sectional area (A) to peak spring discharge (Q) via a physical mechanism, namely the stability shear stress (τ_S) just necessary to maintain a zero gradient in net along-channel sediment transport. It is assumed that if bed shear stress (τ) is greater than τ_S, net erosion will occur, increasing A and reducing τ ~ (Q/A)² back toward τ_S; if τ < τ_S there will be net deposition, reducing A and increasing τ toward τ_S. A survey of the literature allows estimates of Q and A at 242 sections in 26 separate sheltered tidal systems. Assuming a single value of τ_S characterizes the entire length of a given tidal channel, it is predicted that along-channel geometry will follow the relation A h_R^(1/6) ∝ Q. Along-channel regressions of the form A h_R^(1/6) ∝ Q^β give a mean observed value for β of 1.00 ± 0.06, which is consistent with this concept. Results indicate that a lower bound on τ_S (and an upper bound on A) for stable channels is provided by the critical shear stress (τ_C) just capable of initiating sediment motion. Observed τ_S is found to vary among all systems as a function of spring tidal range (R_sp) according to the relation τ_S ≈ 2.3 R_sp^0.79 τ_C. Observed deviations from uniform τ_S along individual channels are associated with along-channel variation in the direction of maximum discharge (i.e., flood- versus ebb-dominance).
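One plausible way to see why a uniform stability shear stress implies A h_R^(1/6) ∝ Q is to assume a Manning-type friction law; this assumption is mine for the purpose of the sketch, since the abstract does not reproduce the paper's own derivation.

```latex
% Illustrative derivation assuming Manning friction (an assumption for this
% sketch; the abstract does not spell out the friction law actually used).
% With cross-sectionally averaged velocity U = Q/A and hydraulic radius h_R,
% a Manning-type law gives the bed shear stress
\[
  \tau \;=\; \frac{\rho g\, n^{2} U^{2}}{h_R^{1/3}}
        \;=\; \frac{\rho g\, n^{2} Q^{2}}{A^{2} h_R^{1/3}} .
\]
% Requiring tau to equal a single stability stress tau_S along the channel,
\[
  \tau = \tau_S \;\Longrightarrow\; Q^{2} \propto A^{2} h_R^{1/3}
  \;\Longrightarrow\; A\, h_R^{1/6} \propto Q ,
\]
% which is the along-channel geometric relation tested by the regressions above.
```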

Relevance:

30.00%

Publisher:

Abstract:

The tube diameter in the reptation model is the distance between a given chain segment and its nearest segment in adjacent chains. This dimension is thus related to the cross-sectional area of polymer chains and the nearest approach among chains, without effects of thermal fluctuation and steric repulsion. Previously calculated tube diameters are much larger (about five times) than the actual chain cross-sections. This is ascribed to the local freedom required for mutual rearrangement among neighboring chain segments. This tube-diameter concept suggests to us a relationship to the corresponding entanglement spacing. Indeed, we report here that the critical molecular weight, M_c, for the onset of entanglements is found to be M_c = 28 A / (⟨R²⟩₀/M), where A is the chain cross-sectional area and ⟨R²⟩₀ the mean-square end-to-end distance of a freely jointed chain of molecular weight M. The newly computed relationship between the critical number of backbone atoms for entanglement and the chain cross-sectional area of polymers, N_c = A^0.44, is concordant with the cross-sectional area of polymer chains being the parameter controlling the critical entanglement number of backbone atoms of flexible polymers.
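A quick numerical illustration of the reported relation M_c = 28 A / (⟨R²⟩₀/M) follows; the input numbers are hypothetical placeholders chosen only to show the arithmetic, not data or results from the paper.

```python
# Illustration of the reported relation M_c = 28 * A / (<R^2>_0 / M).
# The input numbers below are hypothetical placeholders, not values from
# the paper; they only demonstrate how the formula is applied.

def critical_entanglement_mw(area_a2: float, r2_over_m: float) -> float:
    """M_c from chain cross-sectional area A (A^2) and <R^2>_0/M (A^2 mol/g)."""
    return 28.0 * area_a2 / r2_over_m

if __name__ == "__main__":
    area = 28.0       # hypothetical chain cross-sectional area, A^2
    r2_over_m = 0.07  # hypothetical <R^2>_0 / M, A^2 mol/g
    print(f"M_c ~ {critical_entanglement_mw(area, r2_over_m):.0f} g/mol")
```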

Relevance:

30.00%

Publisher:

Abstract:

Laser-accelerated proton beams have been proposed for use in different research fields. Great interest has arisen in the potential replacement of conventional accelerating machines with laser-based accelerators, and in particular in the development of new concepts for more compact and cheaper hadrontherapy centres. In this context the ELIMED (ELI MEDical applications) research project has been launched by INFN-LNS and ASCR-FZU researchers within the pan-European ELI-Beamlines facility framework. The ELIMED project aims to demonstrate the potential clinical applicability of optically accelerated proton beams and to realize a laser-accelerated ion transport beamline for multi-disciplinary user applications. In this framework ocular melanoma, for instance uveal melanoma, normally treated with 62 MeV proton beams produced by standard accelerators, will be considered as a model system to demonstrate the potential clinical use of laser-driven protons in hadrontherapy, especially because of the limited constraints in terms of proton energy and irradiation geometry for this particular tumour treatment. Several challenges, from laser-target interaction and beam transport development up to dosimetry and radiobiology, need to be overcome in order to reach the ELIMED final goals. A crucial role will be played by the final design and realization of a transport beamline capable of providing ion beams with proper characteristics in terms of energy spectrum and angular distribution, which will allow dosimetric tests and biological cell irradiation to be performed. A first prototype of the transport beamline has already been designed, and other transport elements are under construction in order to perform a first experimental test with the TARANIS laser system by the end of 2013. A wide international collaboration among specialists of different disciplines such as physics, biology, chemistry and medicine, together with medical doctors, coming from Europe, Japan, and the US, is growing around the ELIMED project with the aim of working on the conceptual design and the technical and experimental realization of this core beamline of the ELI-Beamlines facility. © 2013 SPIE.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a conceptual analysis of the evolution of the concept of a topological space. In particular, it focuses on the transition from the topological spaces inherited from Hausdorff to Grothendieck toposes. It emerges that, compared with traditional topological spaces, toposes radically transform the topological conceptualization of space. Whereas a topological space is a set of points equipped with a structure induced by certain subsets called open sets, a topos is instead a category satisfying certain exactness properties. The most important aspect of this transformation lies in a reversal of the dialectical relation between a space and its points. A topological space is entirely determined by its points, these being understood as indivisible, structureless units; the identity of the space is thus the one conferred on it by its points. In a topos, by contrast, the points and the open sets are determined by the structure of the topos itself. Moreover, the nature of the points changes: they are no longer primitive and indivisible; the points of a topos themselves possess a structure. The analysis also shows that the concept of topological space evolved through a dynamic of rupture and continuity. Between 1945 and 1957, algebraic topology and, to a certain extent, algebraic geometry underwent fundamental changes. The books Foundations of Algebraic Topology by Eilenberg and Steenrod and Homological Algebra by Cartan and Eilenberg, together with sheaf theory, profoundly modified the study of topological spaces. These ruptures, however, were not deep enough to alter the topological conceptualization of space itself; they should therefore be regarded as microfractures from the perspective of the evolution of the concept of topological space. The definitive rupture came only at the beginning of the 1960s with the advent of toposes, within the vast reworking of algebraic geometry undertaken by Grothendieck. The key was Grothendieck's innovative use of category theory. Whereas his predecessors saw in it only a convenient language for expressing certain mathematical ideas, Grothendieck employed it as a tool of conceptual clarification, thereby putting forward an axiomatic-categorical approach to mathematics. This rupture was nevertheless indebted to the innovations associated with Foundations of Algebraic Topology, Homological Algebra and sheaf theory: category theory allowed Grothendieck to exploit the full potential of the ideas introduced by these partial ruptures. From an epistemological point of view, the transition from topological spaces to toposes should then be seen as part of a change of normative position in mathematics, namely from modern mathematics to contemporary mathematics.
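To make the claim that "the points of a topos themselves possess a structure" concrete, the standard categorical definitions usually meant in this context are recalled below; they are given as background, and the thesis may state them in a different form.

```latex
% Standard definitions (background only; the thesis may phrase them differently).
% A Grothendieck topos is a category E equivalent to the category of sheaves
% of sets on some small site (C, J):
\[
  \mathcal{E} \;\simeq\; \mathrm{Sh}(\mathcal{C}, J).
\]
% A point of E is a geometric morphism from the topos of sets,
\[
  p : \mathbf{Set} \longrightarrow \mathcal{E},
  \qquad p = (p^{*}, p_{*}),\; p^{*} \dashv p_{*},\; p^{*} \text{ left exact},
\]
% so points carry structure (functors and natural transformations between
% them), unlike the bare, structureless points of a classical topological space.
```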

Relevance:

30.00%

Publisher:

Abstract:

The problem of extreme poverty in the Third World is not primarily an economic question. It is above all political, because it is the direct consequence of societal choices and of the organization of power at the level of states and of the various bodies of the international community. Politics has as its object the conquest of power and the large-scale distribution of wealth. It is also a moral problem, because the options taken collectively by peoples and by the concert of nations are not always oriented toward the virtue of justice and equal opportunity for all. Extreme poverty and global justice thus form a pair that brings us back to the heart of political and moral philosophy. After the Second World War, political philosophy broadened its horizons: it now reflects more on the exercise of power on the international stage and on the distribution of wealth at the global level. The phenomenon of economic globalization creates mutual dependence and significant multilateral influences among states; more than in the past, autarky is hardly conceivable. The dogma of the inviolable sovereignty of states, stemming from the Treaty of Westphalia in the seventeenth century, is proving increasingly obsolete in view of the common challenges humanity now faces. Hence the need for a redefinition of the meaning of national sovereignties and for a grounding of cosmopolitan rights for every individual on the planet. This is why the pair extreme poverty/global justice calls for philosophical reflection on the concept of responsibility, one that extends not only over the national sphere but also over a broad cosmopolitan range. The expression "Third World countries" may seem archaic, pejorative and humiliating. However, better than "underdeveloped countries" or "developing countries", it conveys, without euphemism, the raw, crude and inelegant reality of the political and economic misery that prevails there. Although it seems outdated, it delimits fairly clearly the conceptual and geographical domain of our philosophical investigation. It designates the set of countries excluded from the economic wealth distributed among nations. Since economic power generally goes hand in hand with political power, this set is also excluded from the major decision-making centres. Characterized by extreme poverty, the Third-World reality requires a careful analysis of the causes of this extreme economic and political marginalization. A typology of the notion of responsibility offers a conceptual figure with a geometry of six angles: causality, morality, capacity, community, outcome, and solidarity, as grounds for reparation. These aspects under which responsibility is studied are framed by philosophical doctrines of consequentialist, utilitarian, deontological and teleological types. The typology of responsibility gives rise to several solutions: helping, out of philanthropy, to save human lives; establishing and assigning responsibilities so that past and present wrongs are repaired at both the national and international levels; promoting the obligation to protect within a sound international context that takes into consideration the negative duty not to harm the most disadvantaged people on the planet; and institutionalizing cross-border rules of justice as well as cosmopolitan rights.
Finally, we shall understand by omniresponsibility the responsibility of all toward those who suffer the torments of extreme poverty in the Third World. Far from being a catch-all concept, it is a set of responsibilities shared by identifiable actors on the world stage, with a view to the co-reparation owed to the victims of global injustice. It aims at a telos: the flourishing of the well-being of the citizen of the world.

Relevance:

30.00%

Publisher:

Abstract:

Asymmetric cell division (ACD) is a division during which cell-fate determinants are distributed preferentially into one of the two daughter cells. Through the action of these determinants, ACD therefore generates two different daughter cells. ACD is thus important for generating cellular diversity and for maintaining the homeostasis of certain stem cells. To induce an asymmetric distribution of cell-fate determinants, the positioning of the mitotic spindle must be very tightly controlled. This frequently generates two daughter cells of different sizes, because the mitotic spindle is off-centre during mitosis, which induces an asymmetric positioning of the cleavage furrow. Although a complex involving heterotrimeric GTPases and proteins linking the microtubules to the cortex has been directly implicated in positioning the mitotic spindle, the exact mechanism that induces the asymmetric positioning of the spindle during ACD is not yet understood. Recent studies suggest that asymmetric regulation of the actin cytoskeleton could be responsible for this asymmetric spindle positioning. We therefore hypothesize that asymmetric actin contractions during cell division could displace the mitotic spindle and the cleavage furrow to create cellular asymmetry. Our preliminary results have shown that cortical blebbing, which is an indication of cortical tension and contraction, occurs preferentially in the anterior half of the sensory organ precursor (SOP) cell during telophase. Our data support the idea that small GTPases of the Rho family could be involved in regulating the mitotic spindle and thereby control the ACD of SOP cells. The experimental settings developed for this thesis to study the regulation of the orientation and positioning of the mitotic spindle will open new avenues for controlling this process, which could be useful in slowing the progression of cancer cells. The preliminary results of this project suggest a way in which small GTPases of the Rho family may be involved in controlling asymmetric cell division in vivo in SOP cells. The theoretical models explained in this study may serve to improve quantitative cell-biology methods for studying ACD.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is an attempt to initiate the development of a discrete geometry of the discrete plane H = {(q^m x_0, q^n y_0) : m, n ∈ Z}, where Z is the set of integers, q ∈ (0,1) is fixed, and (x_0, y_0) is a fixed point in the first quadrant of the complex plane with x_0, y_0 ≠ 0. The discrete plane was first considered by Harman in 1972, in order to develop a discrete analytic function theory for geometric difference functions. We briefly discuss, across the various sections, the principle of discretization, an outline of discrete analytic function theory, the concept of the geometry of space, and a summary of the work done in this thesis.
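As background on the "geometric difference functions" mentioned here, one standard form of the geometric (q-)difference quotient associated with such a lattice is recalled below; this is given as an assumption for orientation, and Harman's exact definitions and notation may differ.

```latex
% One standard form of the geometric (q-)difference quotient on the lattice
% H = {(q^m x_0, q^n y_0) : m, n in Z}, given as background only; Harman's
% exact definitions may differ in detail.
\[
  (D_q f)(z) \;=\; \frac{f(qz) - f(z)}{qz - z}, \qquad q \in (0,1),\; z \neq 0,
\]
% so that as q -> 1 the operator D_q formally recovers the ordinary derivative,
% and discrete analytic functions on H are typically those satisfying a
% q-analogue of the Cauchy-Riemann equations built from such quotients.
```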