820 results for contours
Abstract:
This paper deals with the preliminary processing of color images. The problems addressed include interference suppression (local artifacts and noise) and extraction of the object from the background at the stage preceding contour extraction. It was long considered inadmissible to apply smoothing when segmentation is performed through boundary extraction, but the methods described and the results obtained demonstrate the expedience of using noise-suppression methods.
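A minimal illustrative sketch of this idea, smoothing before boundary and contour extraction, assuming OpenCV; the specific filters, parameters, and function names below are generic choices, not the paper's algorithm:

```python
# Illustrative sketch: noise suppression before contour extraction.
# Uses OpenCV (cv2); filters and thresholds are assumptions, not the
# method described in the abstract.
import cv2

def extract_contours(path, median_ksize=5, gaussian_sigma=1.5):
    image = cv2.imread(path)                        # BGR color image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # work on luminance only

    # Local interference (impulse noise) is attenuated by a median filter,
    # broadband noise by a Gaussian blur applied afterwards.
    denoised = cv2.medianBlur(gray, median_ksize)
    smoothed = cv2.GaussianBlur(denoised, (0, 0), gaussian_sigma)

    # Boundary extraction on the smoothed image; Canny thresholds are arbitrary.
    edges = cv2.Canny(smoothed, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours
```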
Abstract:
Recent research suggests that the ability of an extraneous formant to impair intelligibility depends on the variation of its frequency contour. This idea was explored using a method that ensures interference cannot occur through energetic masking. Three-formant (F1+F2+F3) analogues of natural sentences were synthesized using a monotonous periodic source. Target formants were presented monaurally, with the target ear assigned randomly on each trial. A competitor for F2 (F2C) was presented contralaterally; listeners must reject F2C to optimize recognition. In experiment 1, F2Cs with various frequency and amplitude contours were used. F2Cs with time-varying frequency contours were effective competitors; constant-frequency F2Cs had far less impact. To a lesser extent, amplitude contour also influenced competitor impact; this effect was additive. In experiment 2, F2Cs were created by inverting the F2 frequency contour about its geometric mean and varying its depth of variation over a range from constant to twice the original (0%-200%). The impact on intelligibility was least for constant F2Cs and increased up to ∼100% depth, but little thereafter. The effect of an extraneous formant depends primarily on its frequency contour; interference increases as the depth of variation is increased until the range exceeds that typical for F2 in natural speech.
Abstract:
An important aspect of speech perception is the ability to group or select formants using cues in the acoustic source characteristics; for example, fundamental frequency (F0) differences between formants promote their segregation. This study explored the role of more radical differences in source characteristics. Three-formant (F1+F2+F3) synthetic speech analogues were derived from natural sentences. In Experiment 1, F1+F3 were generated by passing a harmonic glottal source (F0 = 140 Hz) through second-order resonators (H1+H3); in Experiment 2, F1+F3 were tonal (sine-wave) analogues (T1+T3). F2 could take either form (H2 or T2). In some conditions, the target formants were presented alone, either monaurally or dichotically (left ear = F1+F3; right ear = F2). In others, they were accompanied by a competitor for F2 (F1+F2C+F3; F2), which listeners must reject to optimize recognition. Competitors (H2C or T2C) were created using the time-reversed frequency and amplitude contours of F2. Dichotic presentation of F2 and F2C ensured that the impact of the competitor arose primarily through informational masking. In the absence of F2C, the effect of a source mismatch between F1+F3 and F2 was relatively modest. When F2C was present, intelligibility was lowest when F2 was tonal and F2C was harmonic, irrespective of which type matched F1+F3. This finding suggests that source type and context, rather than similarity, govern the phonetic contribution of a formant. It is proposed that wideband harmonic analogues are more effective informational maskers than narrowband tonal analogues, and so become dominant in across-frequency integration of phonetic information when placed in competition.
Abstract:
Recent research suggests that the ability of an extraneous formant to impair intelligibility depends on the variation of its frequency contour. This idea was explored using a method that ensures interference occurs only through informational masking. Three-formant analogues of sentences were synthesized using a monotonous periodic source (F0 = 140 Hz). Target formants were presented monaurally; the target ear was assigned randomly on each trial. A competitor for F2 (F2C) was presented contralaterally; listeners must reject F2C to optimize recognition. In experiment 1, F2Cs with various frequency and amplitude contours were used. F2Cs with time-varying frequency contours were effective competitors; constant-frequency F2Cs had far less impact. Amplitude contour also influenced competitor impact; this effect was additive. In experiment 2, F2Cs were created by inverting the F2 frequency contour about its geometric mean and varying its depth of variation over a range from constant to twice the original (0–200%). The impact on intelligibility was least for constant F2Cs and increased up to ~100% depth, but little thereafter. The effect of an extraneous formant depends primarily on its frequency contour; interference increases as the depth of variation is increased until the range exceeds that typical for F2 in natural speech.
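A hedged sketch of the competitor-contour manipulation described in the two abstracts above, assuming a log-frequency definition of "depth"; the function name and the exact depth convention are illustrative assumptions, not taken from the published stimuli:

```python
# Sketch of constructing a competitor (F2C) frequency contour by inverting
# the F2 contour about its geometric mean and scaling the depth of variation.
# The log-frequency formulation is an assumption about how "depth" is defined.
import numpy as np

def make_f2c_contour(f2_hz, depth=1.0):
    """f2_hz: array of F2 frequencies over time (Hz); depth: 0.0-2.0 (0-200%)."""
    log_f2 = np.log(f2_hz)
    log_gm = log_f2.mean()                          # log of the geometric mean
    inverted = 2.0 * log_gm - log_f2                # reflect the contour about the mean
    scaled = log_gm + depth * (inverted - log_gm)   # 0 -> constant, 1 -> original depth
    return np.exp(scaled)
```

With depth=0.0 the competitor is a constant-frequency formant at the geometric mean of F2; depth=1.0 reproduces the inverted contour; depth=2.0 doubles its excursion on a log-frequency scale.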
Abstract:
The article first gives an overview of the formation and the evolution of the principle of non-refoulement under international law. The different meanings of the concept in the asylum and human rights contexts are then discussed and compared, with due regard to the convergences that arose in the course of legal developments. In doing so, this short piece also draws attention to certain controversial issues and blurred lines, which have surfaced through the practical application of the prohibition of refoulement. Identifying the contours of the concept and clarifying its content and its effects may help in appreciating the implications that stem, in the current extraordinary times of migratory movements, from the fundamental humanitarian legal principles of which the imperative of non-refoulement forms part.
Abstract:
This study examines the contours of Turkish-American foreign relations in the post-Cold War era from 1990 to 2005. While providing an interpretive analysis, the study highlights elements of continuity and change and of convergence and divergence in the relationship between Ankara and Washington. Turkey’s encounter with its Kurdish problem at home became intertwined with the emergence of an autonomous Kurdish authority in northern Iraq after the Gulf War, which left a political vacuum in the region. The main argument of this dissertation is that the Kurdish question has been the central element in shaping and redefining the nature and scope of Turkish-American relations since 1991. This study finds that systemic factors largely prevailed in the early years of post-Cold War Turkish-American relations, as had been the case during the Cold War era. However, the Turkish parliament’s rejection of the deployment of U.S. troops in Turkey for the invasion of Iraq in 2003 cannot be explained by the primacy of the distribution of capabilities in the system. Instead, the role of identity, ideology, norms, and the socialization of agency through interaction and language must be considered. The Justice and Development Party’s ascension to power in 2002 magnified a wider transformation in domestic and foreign politics and reflected changes in Turkey’s own self-perception and in the definition of its core interests towards the United States.
Abstract:
What actors and processes, at what levels of analysis and through what mechanisms, have pushed Iran's nuclear program (INP) towards being designated as a proliferation threat (securitization)? What actors and processes, at what levels of analysis and through what mechanisms, have pushed Iran's nuclear program away from being designated as an existential threat (de-securitization)? What has been the overall balance of power and interaction dynamics of these opposing forces over the last half-century, and what is their most likely future trajectory? Iran's nuclear story can be told as the unfolding of constant interaction between state and non-state forces of "nuclear securitization" and "nuclear de-securitization." Tracking the crisscrossing interaction between these different securitizing and de-securitizing actors in a historical context constitutes the central task of this project. A careful tracing of "security events" on different analytical levels reveals the broad contours of the evolutionary trajectory of INP and its possible future path(s). Out of this theoretically conscious historical narrative, one can make informed observations about the overall thrust of INP along the securitization-de-securitization continuum. The main contributions of this work are threefold. First, it brings a fresh theoretical perspective to Iran's proliferation behavior by utilizing securitization theory, tracing the initial indications of the threat designation of INP back to the mid-1970s. Second, it gives a solid and thematically grounded historical texture to INP by providing an intimate engagement with the persons, processes, and events of Tehran's nuclear pursuit over half a century. Third, it demonstrates how INP has interacted with and at times even transformed the NPT as the keystone of the non-proliferation regime, and how it has affected and injected urgency into the international discourse on nuclear proliferation, specifically in the Middle East.
Abstract:
Posing radical challenges to structural inequality is the defining quality of the Left. What role electoral politics might play in such processes is a dilemma of radical politics, the contours of which vary by historical and national context. For the U.S. Left there is a distinctive aspect of this dilemma, directly related to the failure of a "Left" party of even the most moderate social democratic type to take root, creating a seemingly never-ending debate over the value, if any, of "third party" progressive organizing. This debate is current, as illustrated by three divergent approaches: independent left electoral politics (Socialist Alternative), organizing within the less conservative of the dominant parties (Progressive Democrats of America), and a social movement focus outside the electoral process (the Occupy Movement). The present-day examples of alternative Left strategies noted here in passing are but three of many such specific organizational options for progressive politics. This article does not seek to advocate for any one of these options to the exclusion of the others but rather seeks to provide historical perspective.
Abstract:
The aim of the thesis is to develop a critique of current liberal conceptualizations of international order. In order to conduct this critique, this thesis revisits the arguments first put forth by the German legal and political theorist Carl Schmitt. Schmitt conceptualizes a tripartite unity between law, order, and place. This unity, established at the constituent moment of land-appropriation, forms a concrete nomos, which subsequently creates the contours of the legal and political order. The establishment of the concrete order is necessarily the construction of a territorial boundary that designates an inside and an outside of the polity. By speaking of a nomos of the earth, Schmitt globalized this understanding of concrete order by looking at the various historical developments that created a "line" between the concrete applicability of interstate norms and a region where the exceptional situation prevails. The critique presented in this thesis is concerned with the lack of concrete boundary conditions within the current international legal order. It is argued that this lack of a well-defined boundary condition is what results in extreme forms of violence that were traditionally bracketed.
Abstract:
While essential to human nature, health and life have been protected since ancient times by various areas of knowledge, particularly by the Law, given its role in regulating social interactions. In Brazil, health was granted major importance by the Federal Constitution of 1988, which, breaking with dictatorial authoritarianism, inaugurated a Social State centered on the values of freedom and human dignity and raised health to the condition of a social right, marked predominantly by an obligation directed primarily at the State and enforced through public policies. However, given the limitation of State action by the reserve for contingencies, it becomes clear that universal access to public health care cannot be achieved by the State alone, since the high cost of medical provisions prevents the State from meeting all the health needs of right holders. As a result of this State insufficiency, the effort of the Constituent Assembly of 1988 to create a hybrid health system becomes central: by allowing healthcare to be provided by private initiative, it assigns to private enterprise a key role in supplementing the public health system, especially through the offer of health insurance plans. It is also clear, however, that the health services rendered by private agents are not unlimited, which raises discussions about services and procedures that should be excluded from contractual coverage for the sake of sectoral balance, a situation that calls for weighing Fundamental Rights related to the protection of health and life, on the one hand, against contractual principles connected to the primacy of private autonomy, on the other. Here the regulation undertaken by the ANS, the Brazilian National Health Agency, becomes essential: by means of its broad functions, considerable autonomy, and technical discretion, the agency is in a position to exercise effective control towards the harmonization of the regulatory triangle, the stability and development of the supplementary health system and, consequently, the universalization of the right to health within constitutional contours. Accordingly, this essay, drawing on a broad legislative, doctrinal, and jurisprudential study, concludes that economic regulation of the private healthcare sector, when legitimately undertaken, brings progress and stability to the regulated segment and, moreover, makes the universalization of healthcare feasible in a way that cannot be replaced efficiently by any other State function.
Abstract:
The purpose of this research is to analyze different daylighting systems in schools in the city of Natal/RN. Although daylight is abundantly available locally, architectural recommendations relating sky conditions, dimensions of daylighting systems, shading, fraction of sky visibility, required illuminance, glare, period of occupation, and depth of the lit area are scarce and diffuse. This research examines different selected aperture systems to explore the potential of natural light offered by each one. The method is divided into three phases. The first phase is modeling, which involves the construction of a three-dimensional model of a classroom in SketchUp 2014, following recommendations presented in the literature for good environmental comfort in school settings. The second phase is the dynamic computer simulation of daylight performance using the Daysim software; the input data are the 2009 climate file for the city of Natal/RN, the classroom volume in 3ds format with the optical properties assigned to each surface, the sensor mapping file, and the user load file. In the third phase, the results produced by the simulation are organized in a spreadsheet prepared by Carvalho (2014) to determine the occurrence of useful daylight illuminance (UDI) in the range of 300 to 3000 lux and to build illuminance curves and UDI contour plots, identifying the uniformity of light distribution, compliance with the minimum illuminance level, and the occurrence of glare.
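A small illustrative sketch of the UDI tally described above, assuming the Daysim illuminance output has already been loaded as an hours-by-sensors array and that an occupancy mask is available; array shapes and names are assumptions, not the Carvalho (2014) spreadsheet itself:

```python
# Sketch of the useful daylight illuminance (UDI) tally: for each sensor,
# the fraction of occupied hours in which the simulated illuminance falls
# below, within, or above the 300-3000 lux band.
import numpy as np

def udi_fractions(illuminance_lux, occupied, low=300.0, high=3000.0):
    """illuminance_lux: (hours, sensors) array; occupied: boolean (hours,) mask."""
    e = illuminance_lux[occupied]               # keep occupied hours only
    n_hours = e.shape[0]
    udi_fell_short = (e < low).sum(axis=0) / n_hours
    udi_achieved = ((e >= low) & (e <= high)).sum(axis=0) / n_hours
    udi_exceeded = (e > high).sum(axis=0) / n_hours
    return udi_fell_short, udi_achieved, udi_exceeded
```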
Development of the base cell of periodic composite microstructures under topology optimization
Abstract:
This thesis develops a new technique for the design of composite microstructures through topology optimization, seeking to maximize stiffness by means of the strain energy method and using an h-adaptive refinement scheme to obtain a better definition of the topological contours of the microstructure. This is done by distributing material optimally within a pre-established design region referred to as the base cell. The finite element method is used to describe the field and to solve the governing equation. The mesh is refined iteratively so that refinement is applied to all elements representing solid material and to all void elements containing at least one node in a solid-material region. The finite element chosen for the model is the linear three-node triangle. The constrained nonlinear programming problem is solved with the augmented Lagrangian method, using a minimization algorithm based on quasi-Newton search directions with the Armijo-Wolfe conditions assisting the descent process. The base cell that represents the composite is found from the equivalence between a fictitious material and a prescribed material distributed optimally over the design domain. The use of the strain energy method is justified by its lower computational cost, owing to a simpler formulation than the traditional homogenization method. Results are presented for changes in the prescribed displacement, for changes in the volume constraint, and for various initial values of the relative densities.
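As a hedged illustration of the kind of formulation involved, written in generic notation rather than the thesis's own: a volume-constrained stiffness (strain energy) minimization over element densities and a standard augmented Lagrangian treatment of the constraint.

```latex
% Illustrative formulation only; symbols are generic assumptions, not the thesis's notation.
% rho_e: element relative densities, u: displacements, K: stiffness matrix, f: loads,
% v_e: element volumes, \bar{V}: volume limit, lambda: multiplier, mu > 0: penalty.
\min_{\boldsymbol{\rho}} \; c(\boldsymbol{\rho}) = \mathbf{u}^{\mathsf{T}} \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u}
\quad \text{s.t.} \quad \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f}, \qquad
g(\boldsymbol{\rho}) = \sum_{e} \rho_e v_e - \bar{V} \le 0

L_{\mu}(\boldsymbol{\rho}, \lambda) = c(\boldsymbol{\rho})
  + \frac{\mu}{2} \left[ \max\!\left(0,\; g(\boldsymbol{\rho}) + \frac{\lambda}{\mu} \right) \right]^{2}
  - \frac{\lambda^{2}}{2\mu},
\qquad \lambda \leftarrow \max\!\left(0,\; \lambda + \mu\, g(\boldsymbol{\rho}) \right)
```

Each outer iteration would minimize L over the densities (here via quasi-Newton directions with an Armijo-Wolfe line search, as the abstract indicates) before updating the multiplier and, if needed, the penalty parameter.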
Abstract:
This study aimed to establish a dialogue between queer theory and the thought of the French philosopher Maurice Merleau-Ponty on the categories of body and sexuality. From this dialogue, further goals were derived: to identify possible repercussions of the experience of queer bodies and sexualities, viewed from Merleau-Ponty’s perspective, for the knowledge of Physical Education, and to reflect on this field of knowledge using the notions of queer epistemology and esthesia. The study adopted the phenomenological attitude proposed by Merleau-Ponty as its methodology and used reduction as its research technique. To link these strands of thought, we used the cinema of the Spanish director Pedro Almodóvar as a perceptive strategy, an exercise of the gaze as a possibility of reading the world and of new ways of perceiving the human being. We examined three films, namely All About My Mother (1999), The Skin I Live In (2011), and Bad Education (2004), which put us in touch with queer bodies and sexualities, with the body of esthesia, of ecstasy, sensations, and lived experiences, a type of art whose contours are not fixed or determinable, as postulated by Merleau-Ponty. The philosopher provides a rich conceptual view of the body and its sexual experience, and extends and opens horizons of thought and reflection about queer experience, an experience indeterminate and contingent as a singular way of inhabiting the world. These horizons opened by the philosopher, added to the queer perspective, help call into question the modes of knowledge production and the knowledge about body and sexuality in Physical Education. Finally, we point out that this theoretical conversation gives us clues for reflecting on the reverberations of a queer epistemology for Physical Education, using a type of knowledge guided by esthesia and sensitivity as marks of another scientific rationality.
Abstract:
This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage from contact with a drifting iceberg. The environmental impacts of hydrocarbon dispersion can be severe, and the overall safety and economic concerns of pipeline leakage in a subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging, or chemical tracers, but in a remote location such as the subsea or the Arctic, a small chronic leak may remain undetected for a long period. In the case of storage, an accidental release of hydrocarbon from a storage tank could lead to a pool fire, which could further escalate into domino effects; this chain of accidents may have extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, with other contributing factors including wind speed and direction, fuel type, and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different fluids showed that the static pressure and the pressure gradient along the axial length of the pipeline have a sharp signature variation near the leak orifice at steady-state conditions. A transient simulation is performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the leak orifice increase; high-pressure fluid flow generates more noise than low-pressure flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed has a significant influence on the behavior of the pool fire and its domino effects. Radiation contours obtained from CFD post-processing can be applied in risk analysis. The outcome of this study will be helpful for a better understanding of the domino effects of pool fires in the complex geometrical settings of process industries. Approaches to reducing and preventing these risks are discussed based on the results of the numerical simulations.
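A brief sketch of the acoustic post-processing step mentioned above, estimating the power spectral density of a monitored pressure signal with Welch's method; this is a generic SciPy-based illustration, not the ANSYS FLUENT workflow used in the thesis, and the sampling rate and segment length are assumptions:

```python
# Estimate the PSD of a pressure signal sampled near the leak orifice
# from a transient simulation; stronger PSD is expected near the orifice.
import numpy as np
from scipy.signal import welch

def leak_psd(pressure_pa, sample_rate_hz, nperseg=1024):
    """pressure_pa: 1-D time series of static pressure at a monitoring point."""
    fluctuation = pressure_pa - np.mean(pressure_pa)   # remove the mean (DC) component
    freqs, psd = welch(fluctuation, fs=sample_rate_hz, nperseg=nperseg)
    return freqs, psd   # PSD in Pa^2/Hz versus frequency in Hz
```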
Abstract:
This research considers the diffusion of cell phones and smartphones and the consequences of this fact for teaching and learning. Communication devices have always been linked to the teaching-learning process; however, with the more intense development of Information and Communication Technologies (ICT) in recent decades, this relationship has been taking on new contours. The Internet emerged, computing machines evolved and, more recently, mobile devices have proliferated, providing new convergent products and services. In this context, cell phones and smartphones have been used and recommended to support and complement the teaching-learning process: so-called Mobile Learning. This field is growing due to the rapid expansion and falling cost of these technologies in society. To examine this relationship scientifically, a qualitative, exploratory study was conducted with two Mobile Learning projects under way in Brazil: Palma (Programa de Alfabetização na Língua Materna) and Escola Com Celular (ECC). From the data produced by the research, we identified aspects of the use of cell phones and smartphones for teaching and learning that contribute to the understanding of this field, which is still under construction in Brazil. In the projects studied, the use of these devices to support teaching-learning processes is shaped by the aspects of technology, device, audience and context, and new technologies and Mobile Learning. The device aspect unfolds into dimensions such as dissemination, multifunctionality, and accessibility, which underpin the projects and also favor characteristics considered important for teaching and learning today, such as mobility and portability. The projects studied demonstrate potential and a methodology appropriate to the contexts for which they were created and applied. However, the research indicated that although cell phones and smartphones represent the apex of technological convergence and are considered extremely popular and accessible in contemporary society, with concrete possibilities as in the projects studied, they have not achieved a solid position as a support for teaching and learning. According to the corpus, this is due to the lack of certain factors: funding, since the practices are extremely dependent on public or private initiative for their extension and continuity; awareness regarding the use of available technologies, since the projects do not consider the students' own devices; and planning that includes, trains, and encourages the use of these devices. The research also highlights the need for a critical view of the use and role of technology in these processes.