922 results for justify
Abstract:
Self-similarity, a concept taken from mathematics, is gradually becoming a keyword in musicology. Although a polysemic term, self-similarity often refers to multi-scalar feature repetition within a set of relationships, and it is commonly valued as an indication of musical coherence and consistency. This investigation provides a theory of musical meaning formation in the context of intersemiosis, that is, the translation of meaning from one cognitive domain to another (e.g. from mathematics to music, or to speech or graphic forms). From this perspective, the degree of coherence of a musical system relies on a synecdochic intersemiosis: a system of related signs within other comparable and correlated systems. This research analyzes the modalities of such correlations, exploring their general and particular traits and their operational bounds. In this direction, the notion of analogy is used as a rich concept through the two definitions given in the Classical literature, proportion and paradigm, which are enormously valuable in establishing criteria of measurement, likeness and affinity. Using quantitative and qualitative methods, evidence is presented to justify a parallel study of different modalities of musical self-similarity. For this purpose, original arguments by Benoît B. Mandelbrot are reviewed, alongside a systematic critique of the literature on the subject. Furthermore, connecting Charles S. Peirce's synechism with Mandelbrot's fractality is one of the main developments of the present study. This study provides elements for explaining Bolognesi's (1983) conjecture, which states that the most primitive, intuitive and basic musical device is self-reference, extending its functions and operations to self-similar surfaces. In this sense, this research suggests that, through various modalities of self-similarity, synecdochic intersemiosis acts as a system of systems in coordination, with greater or lesser development of structural consistency and greater or lesser contextual dependence.
Abstract:
The study analyzes the effort to build political legitimacy in the Republic of Turkey by exploring a group of influential texts produced by Kemalist writers. The study explores how the Kemalist regime reproduced a certain long-lasting enlightenment meta-narrative in its effort to build political legitimacy. Central in this process was a hegemonic representation of history, namely the interpretation of the Anatolian Resistance Struggle of 1919–1922 as a Turkish Revolution executing the enlightenment in the Turkish nation-state. The method employed in the study is contextualizing narratological analysis. The Kemalist texts are analyzed with a repertoire of concepts originally developed in the theory of narrative. By bringing these concepts together with the epistemological foundations of the historical sciences, the study creates a theoretical frame within which it is possible to highlight how initially very controversial historical representations in the end managed to construct long-lasting, emotionally and intellectually convincing bases of national identity for the secular middle classes in Turkey. The two most important explanatory concepts in this sense are diegesis and the implied reader. Diegesis refers to the ability of narrative representation to create an inherently credible story-world that works as the basis of national community. The implied reader refers to the process whereby a certain hegemonic narrative creates a formula of identification and a position through which any individual real-world reader of a story can step inside the narrative story-world and identify as 'one of us' of the national narrative. The study demonstrates that the Kemalist enlightenment meta-narrative created a group of narrative accruals which enabled generations of the secular middle classes to internalize Kemalist ideology. In this sense, the narrative in question has not only worked as a tool utilized by the so-called Kemalist state-elite to justify its leadership, but has been internalized by various groups in Turkey, working as their genuine world-view. It is shown in the study that secularism must be seen as the core ingredient of these groups' national identity. The study proposes that the enlightenment narrative reproduced in the Kemalist ideology had its origin in a similar totalizing cultural narrative created in and for Europe. Currently this enlightenment project is challenged in Turkey by those who seek to give religion a greater role in Turkish society. The study argues that the enduring practice of legitimizing political power through the enlightenment meta-narrative has not only become a major factor contributing to social polarization in Turkey, but has also, in contradiction to the very real potential for critical approaches inherent in the Enlightenment tradition, crucially restricted the development of critical and rational modes of thinking in the Republic of Turkey.
Abstract:
The purpose of this master's thesis is to analyze how NATO Secretary General Anders Fogh Rasmussen tries to justify the existence of the military alliance through the use of security arguments. I am puzzled by the question of why NATO still exists, and what NATO's raison d'être is. The New Strategic Concept (2010) forms the basis of his argumentation. This thesis focuses on the security argumentation of NATO, which is examined by analyzing the speeches of the Secretary General. The theoretical framework of this study is based on a constructivist approach to international security, examining the linguistic process of securitization. Issues become securitized after Anders Fogh Rasmussen names them as threats. This thesis focuses on the securitization process relating to NATO and analyses which issues Rasmussen raises to the security agenda. The research data consist of the speeches of Anders Fogh Rasmussen. They are analyzed through J.L. Austin's speech act taxonomy and Chaïm Perelman's argumentation theories. The thesis concentrates on the formulation and articulation of those threats which are considered and coined as “new threats” in contemporary international relations. I conduct this research through the use of securitization theory. This study illustrates that the threats are constructed by NATO's member states in unison, but the resolutions are voiced through Rasmussen's official speeches and transcripts. Based on the analysis it can be concluded that Rasmussen gives reasons for the existence of NATO. This takes place by making use of speech acts and different rhetorical techniques. The results of the analysis indicate that, according to the Secretary General, NATO remains an essential organization for the West and the rest of the world.
Abstract:
The well-known linear relationship (TΔS# = αΔH# + β, where 1 > α > 0, β > 0) between the entropy (ΔS#) and the enthalpy (ΔH#) of activation for reactions in polar liquids is investigated by using a molecular theory. An explicit derivation of this linear relation from first principles is presented for an outer-sphere charge transfer reaction. The derivation offers a microscopic interpretation for the quantities α and β. It has also been possible to make connection with and justify the arguments of Bell put forward many years ago.
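For orientation, the compensation relation above can be combined with the definition of the free energy of activation; the sketch below is a schematic consequence written in the reconstructed notation (α, β as slope and intercept) and is not taken verbatim from the paper.

```latex
% Compensation relation between the activation parameters (reconstructed notation):
T\,\Delta S^{\#} \;=\; \alpha\,\Delta H^{\#} + \beta,
\qquad 1 > \alpha > 0, \quad \beta > 0 .
% Combining this with the definition of the free energy of activation,
\Delta G^{\#} \;=\; \Delta H^{\#} - T\,\Delta S^{\#}
             \;=\; (1-\alpha)\,\Delta H^{\#} - \beta ,
% so a change in \Delta H^{\#} is only partially compensated by the entropic
% term, because the slope \alpha is strictly smaller than one.
```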
Abstract:
The dynamo effect is used to describe the generation of magnetic fields in astrophysical objects. However, no rigorous derivation of the dynamo equation is available. We justify the form of the equation using an Operator Product Expansion (OPE) of the relevant fields. We also calculate the coefficients of the OPE series using a dynamic renormalisation group approach and discuss the time evolution subject to the initial conditions on the seed magnetic field.
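For context, the mean-field form of the dynamo equation usually quoted in the literature is sketched below; it is given only as a textbook reference for "the form of the equation", and its α and β coefficients need not coincide with the OPE coefficients computed in the paper.

```latex
% Standard mean-field dynamo equation (schematic reference only):
\frac{\partial \langle \mathbf{B} \rangle}{\partial t}
  \;=\; \nabla \times \bigl( \langle \mathbf{v} \rangle \times \langle \mathbf{B} \rangle \bigr)
  \;+\; \nabla \times \bigl( \alpha\, \langle \mathbf{B} \rangle \bigr)
  \;+\; (\eta + \beta)\, \nabla^{2} \langle \mathbf{B} \rangle ,
% where \alpha encodes the mean kinetic helicity of the turbulence,
% \beta the turbulent magnetic diffusivity, and \eta the molecular diffusivity.
```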
Abstract:
Vehicular ad hoc network (VANET) applications are principally categorized into safety and commercial applications. Efficient traffic management for routing an emergency vehicle is of paramount importance in safety applications of VANETs. In the first case, a typical example of a highly dense urban scenario is considered to demonstrate the role of the penetration ratio in achieving reduced travel time between source and destination points. The major requirement for testing these VANET applications is a realistic simulation approach which would justify the results prior to actual deployment. A traffic simulator coupled with a network simulator through a feedback loop is apt for realistic simulation of VANETs. Thus, in this paper, we develop the safety application using the traffic control interface (TraCI), which couples SUMO (traffic simulator) and NS2 (network simulator). Likewise, the mean throughput is one of the necessary performance measures for commercial applications of VANETs. In the next case, commercial applications are considered wherein data is transferred among vehicles (V2V) and between roadside infrastructure and vehicles (I2V), and the throughput is assessed.
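As a rough illustration of the traffic-simulator side of such a coupling, the sketch below drives SUMO through its Python TraCI interface and reads the mobility state of an emergency vehicle each step. The configuration file name and vehicle ID are placeholders, and the actual feedback loop to NS2 described in the paper is not reproduced here.

```python
# Minimal TraCI sketch (assumes SUMO is installed; "city.sumocfg" and
# "emergency_0" are placeholder names, not taken from the paper).
import traci

traci.start(["sumo", "-c", "city.sumocfg"])  # launch SUMO with a scenario config

try:
    while traci.simulation.getMinExpectedNumber() > 0:
        traci.simulationStep()  # advance the traffic simulation by one step
        if "emergency_0" in traci.vehicle.getIDList():
            pos = traci.vehicle.getPosition("emergency_0")  # (x, y) in network coords
            speed = traci.vehicle.getSpeed("emergency_0")   # m/s
            # In a coupled setup, this mobility state would be handed to the
            # network simulator, and routing feedback could be applied here,
            # e.g. via traci.vehicle.rerouteTraveltime("emergency_0").
            print(traci.simulation.getTime(), pos, speed)
finally:
    traci.close()
```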
Abstract:
In this paper, a C0 interior penalty method is proposed and analyzed for distributed optimal control problems governed by the biharmonic operator. The state and adjoint variables are discretized using continuous piecewise quadratic finite elements, while the control variable is discretized using piecewise constant approximations. A priori and a posteriori error estimates are derived for the state, adjoint and control variables under minimal regularity assumptions. Numerical results justify the theoretical results obtained. The a posteriori error estimators are useful in adaptive finite element approximation, and the numerical results indicate that the sharp error estimators work efficiently in guiding the mesh refinement.
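A representative model problem of the class referred to here is sketched below. It is an assumed formulation for illustration: the regularization weight λ, the target y_d, and the clamped boundary conditions are generic choices, not quoted from the paper.

```latex
% Distributed optimal control governed by the biharmonic operator (illustrative):
\min_{(y,u)} \; J(y,u) \;=\; \tfrac{1}{2}\,\| y - y_d \|_{L^2(\Omega)}^{2}
                       \;+\; \tfrac{\lambda}{2}\,\| u \|_{L^2(\Omega)}^{2}
\quad \text{subject to} \quad
\Delta^{2} y = u \ \text{in } \Omega, \qquad
y = \frac{\partial y}{\partial n} = 0 \ \text{on } \partial\Omega ,
% optionally with pointwise control constraints u_a \le u \le u_b, which is
% consistent with discretizing the control by piecewise constants.
```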
Abstract:
A deformable mirror (DM) is an important component of an adaptive optics system. It is known that an on-axis spherical/parabolic optical component, placed at an angle to the incident beam, introduces defocus as well as astigmatism in the image plane. Although the former can be compensated by changing the focal plane position, the latter cannot be removed by mere optical realignment. Since the DM is to be used to compensate a turbulence-induced curvature term in addition to other aberrations, it is necessary to determine the aberrations induced by such a curved DM surface when placed at a non-zero angle of incidence in the optical path. To this end, we estimate to first order the aberrations introduced by a DM as a function of the incidence angle and the deformation of the DM surface. We record images using a simple setup in which the incident beam is reflected by a 37-channel micro-machined membrane deformable mirror for various angles of incidence. It is observed that astigmatism is a dominant aberration, which was determined by measuring the difference between the tangential and sagittal focal planes. We justify our results on the basis of theoretical simulations and discuss the feasibility of using such a system for adaptive optics, considering a trade-off between wavefront correction and astigmatism due to deformation.
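The standard first-order result for a spherical mirror used off-axis, quoted below for orientation, shows why the tangential-sagittal focal separation is a natural measure of this astigmatism; R (radius of curvature) and θ (angle of incidence) are symbols introduced for this sketch rather than the paper's notation.

```latex
% Tangential and sagittal focal lengths of a spherical mirror at incidence \theta:
f_T = \frac{R}{2}\cos\theta , \qquad f_S = \frac{R}{2\cos\theta} ,
\qquad
\Delta f \;=\; f_S - f_T \;=\; \frac{R}{2}\,\frac{\sin^{2}\theta}{\cos\theta}
         \;\approx\; \frac{R}{2}\,\theta^{2} \quad (\theta \ll 1),
% so the separation between the sagittal and tangential focal planes grows
% roughly quadratically with the incidence angle, which is why astigmatism
% dominates once the curved DM surface is tilted in the beam.
```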
Abstract:
In this discussion, we show that a static definition of a 'bond' is not viable by looking at a few examples of both inter- and intra-molecular hydrogen bonding. This follows from our earlier work (Goswami and Arunan, Phys. Chem. Chem. Phys. 2009, 11, 8974), which showed a practical way to differentiate 'hydrogen bonding' from 'van der Waals interaction'. We report results from ab initio and atoms-in-molecules theoretical calculations for a series of Rg···HX complexes (Rg = He/Ne/Ar and X = F/Cl/Br) and ethane-1,2-diol. Results for the Rg···HX/DX complexes show that Rg···DX could have a 'deuterium bond' even when Rg···HX is not 'hydrogen bonded', according to the practical criterion given by Goswami and Arunan. Results for ethane-1,2-diol show that an 'intra-molecular hydrogen bond' can appear during a normal-mode vibration dominated by the O···O stretching, though a 'bond' is not found in the equilibrium structure. This dynamical 'bond' formation may nevertheless be important in ensuring the continuity of electron density across a molecule. In the former case, a vibration 'breaks' an existing bond, and in the latter case, a vibration leads to 'bond' formation. In both cases, the molecule/complex stays bound irrespective of what happens to this 'hydrogen bond'. Both these cases push the borders of the recent IUPAC recommendation on hydrogen bonding (Arunan et al., Pure Appl. Chem. 2011, 83, 1637) and justify the inclusive nature of the definition.
Abstract:
Coarse Grained Reconfigurable Architectures (CGRAs) are emerging as embedded application processing units in computing platforms for Exascale computing. Such CGRAs are distributed-memory multi-core compute elements on a chip that communicate over a Network-on-Chip (NoC). Numerical Linear Algebra (NLA) kernels are key to several high performance computing applications. In this paper we propose a systematic methodology to obtain the specification of Compute Elements (CEs) for such CGRAs. We analyze block Matrix Multiplication and block LU Decomposition algorithms in the context of a CGRA, and obtain theoretical bounds on communication requirements and memory sizes for a CE. Support for high performance custom computations common to NLA kernels is met through custom function units (CFUs) in the CEs. We present results to justify the merits of such CFUs.
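As a rough illustration of why block algorithms are analyzed in terms of block transfers and local memory, the sketch below counts the block loads a single compute element would issue in a naive blocked matrix multiplication. The block size b and matrix size n are placeholder parameters, and the count is the simple "one block of A plus one block of B per inner iteration" figure, not any bound derived in the paper.

```python
import numpy as np

def blocked_matmul(A, B, b):
    """Blocked C = A @ B with a counter for block loads into local memory.

    Assumes square matrices whose dimension n is a multiple of the block size b.
    """
    n = A.shape[0]
    C = np.zeros((n, n))
    block_loads = 0
    for i in range(0, n, b):
        for j in range(0, n, b):
            acc = np.zeros((b, b))          # C block kept in local memory
            for k in range(0, n, b):
                acc += A[i:i+b, k:k+b] @ B[k:k+b, j:j+b]
                block_loads += 2            # one block of A and one of B fetched
            C[i:i+b, j:j+b] = acc
    # (n/b)^3 inner iterations, 2 block fetches each -> 2*(n/b)^3 block loads,
    # while local memory only needs to hold three b-by-b blocks at a time.
    return C, block_loads

if __name__ == "__main__":
    n, b = 256, 32
    A, B = np.random.rand(n, n), np.random.rand(n, n)
    C, loads = blocked_matmul(A, B, b)
    print(loads, np.allclose(C, A @ B))
```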
Abstract:
A new method of selecting the time-to-go (t_go) for the Generalized Vector Explicit Guidance (GENEX) law is proposed in this paper. t_go is known to be an important parameter in the control and cost function of the GENEX guidance law. In this paper, a formulation is derived to find an optimal value of t_go that minimizes the performance cost. Mechanizing GENEX with this optimal t_go reduces the lateral acceleration demand and consequently increases the range of the interceptor. This new formulation for computing t_go comes in closed form and can thus be implemented onboard. The formulation is applied in the terminal phase of a surface-to-air interceptor for an angle-constrained engagement. Results generated by simulation justify the use of the optimal t_go.
Abstract:
This paper studies the effect of fissure water pressure in different fractures on the critical angle of a landslide by laboratory investigation and numerical simulation, in order to understand the mechanisms by which fissure water pressure affects landslide stability. Laboratory observations show that the effect of fissure water pressure on the critical angle of the landslide is small when the distance between the water-holding fracture and the slope toe is more than three times the depth of the fissure water. These experimental results are also simulated by a three-dimensional face-to-face contact discrete element method. This method includes the fissure water pressure and can accurately calculate the critical angle of a jointed slope when fissure water pressure exists in the vertical sliding surface. Numerical results are in good agreement with experimental observations. It is revealed that the location of the water-holding structural surface is important to landslide stability. The ratio of the distance between the water-holding fissure and the slope toe to the depth of the fissure water is a key parameter for justifying the effect of fissure water pressure on the critical angle of the landslide.
Abstract:
A large-strain finite element method is employed to investigate the effect of straining mode on void growth. An axisymmetric cell model embedded with a spherical void is subjected to constant-triaxiality loading, while a plane-stress model containing a circular void is loaded at a constant ratio of straining. An elastic-plastic material is used for the matrix in both cases. It is concluded that, besides the known effect of triaxiality, a straining mode which intensifies the plastic concentration around the void is also a void growth stimulator. Experimental results are cited to justify the computational results.
Abstract:
Damage evolution of heterogeneous brittle media involves a wide range of length scales. The coupling between these length scales underlies the mechanism of damage evolution and rupture. However, few previous numerical algorithms consider the effects of this trans-scale coupling effectively. In this paper, an adaptive mesh refinement FEM algorithm is developed to simulate this trans-scale coupling. An adaptive serendipity element is implemented in this algorithm, and several special discontinuous basis functions are created to avoid incompatible displacements between the elements. Both a benchmark and a typical numerical example under quasi-static loading are given to justify the effectiveness of this model. The numerical results reproduce a series of characteristics of damage and rupture in heterogeneous brittle media.
Abstract:
A shape memory alloy (SMA) can be easily deformed to a new shape by applying a small external load at low temperature, and then recovers its original configuration upon heating. This unique shape memory phenomenon has inspired many novel designs; the SMA-based heat engine is one of them. An SMA heat engine is an environmentally friendly alternative for extracting mechanical energy from low-grade energy sources, for instance warm wastewater, geothermal energy, or solar thermal energy. The aim of this paper is to present an applicable theoretical model for the simulation of SMA-based heat engines. First, a micro-mechanical constitutive model is derived for SMAs. The volume fractions of austenite and the martensite variants are chosen as internal variables to describe the evolution of the microstructure in the SMA upon phase transition. Subsequently, the energy equation is derived based on the first law of thermodynamics and the preceding SMA model. From Fourier's law of heat conduction and Newton's law of cooling, both differential and integral forms of the energy conversion equation are obtained.
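As a hedged sketch of what such an energy equation typically looks like for an SMA element exchanging heat with a surrounding fluid, the generic forms below combine the first law with Fourier's law inside the body and Newton's law of cooling at its surface. The symbols are introduced here for illustration and are not taken from the paper.

```latex
% Illustrative symbols: \rho density, c specific heat, k conductivity,
% h film coefficient, A surface area, V volume, \xi martensite fraction,
% \Delta H latent heat of transformation, T_\infty fluid temperature.
% Differential (field) form, using Fourier's law inside the body:
\rho c \,\frac{\partial T}{\partial t}
  \;=\; \nabla \cdot \bigl( k \nabla T \bigr)
  \;+\; \rho\, \Delta H \,\frac{\partial \xi}{\partial t},
\qquad
-\,k \,\frac{\partial T}{\partial n}\Big|_{\partial\Omega} \;=\; h\,(T - T_\infty);
% Integrated (lumped) form, using Newton's law of cooling at the surface:
\rho c V \,\frac{\mathrm{d} T}{\mathrm{d} t}
  \;=\; -\,h A\,(T - T_\infty) \;+\; \rho V \,\Delta H \,\frac{\mathrm{d}\xi}{\mathrm{d}t}.
```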