993 results for Free interface
Abstract:
Multi-track audio recording and editing programs are popular among musicians for developing their work. These programs offer recording and editing features, but they do not support collaborative work between musicians. In order to collaborate, the members of a band have to gather in the same physical location. This work aims to create a solution for collaboration in the context of audio recording and editing. The goal is to develop a distributed application that facilitates audio recording and editing while the members of a band are in different physical locations. The developed application provides audio manipulation features, as well as mechanisms for synchronizing the work among the band members. Audio manipulation consists of playback, recording, encoding, and editing. Audio is handled in the Microsoft WAV format, resulting from the digitization of the audio in Pulse Code Modulation (PCM), and is subsequently encoded in FLAC (Free Lossless Audio Codec) or MP3 (MPEG-1 Layer 3) in order to minimize the file size, thereby reducing the disk space it occupies and the bandwidth required for its transmission over the internet. Editing consists of applying operations such as amplification and echoes, among others. The band members install the client application on their computers, with a graphical interface where they develop their work. This client application maintains the synchronization logic of the collaborative work, acting as one of the peers of the distributed application's hybrid peer-to-peer architecture. These peers communicate with each other, exchanging information about the operations applied and the audio recorded by the band members.
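A minimal Java sketch of the kind of message such peers might exchange to synchronize collaborative edits; the class and field names are hypothetical and only illustrate the idea of broadcasting applied operations and recorded audio, not the application's actual protocol.

// Hypothetical message exchanged between peers to synchronize collaborative edits.
// Names and fields are illustrative; the real application's protocol is not specified here.
import java.io.Serializable;
import java.util.UUID;

public class EditOperation implements Serializable {
    public enum Type { RECORD, AMPLIFY, ECHO }

    private final UUID operationId = UUID.randomUUID(); // unique id, for ordering and deduplication
    private final String trackId;        // which track of the multi-track project is affected
    private final Type type;             // operation applied by the band member
    private final long startSampleFrame; // position in the track where the operation applies
    private final byte[] encodedAudio;   // FLAC/MP3 payload when the operation carries new audio, else null

    public EditOperation(String trackId, Type type, long startSampleFrame, byte[] encodedAudio) {
        this.trackId = trackId;
        this.type = type;
        this.startSampleFrame = startSampleFrame;
        this.encodedAudio = encodedAudio;
    }

    public UUID getOperationId() { return operationId; }
    public String getTrackId() { return trackId; }
}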
Abstract:
We characterize the elastic contribution to the surface free energy of a nematic liquid crystal in the presence of a sawtooth substrate. Our findings are based on numerical minimization of the Landau-de Gennes model and analytical calculations within the Frank-Oseen theory. The nucleation of disclination lines (characterized by non-half-integer winding numbers) in the wedges and apexes of the substrate induces a leading-order term proportional to q ln q in the elastic contribution to the surface free-energy density, with q being the wave number associated with the substrate periodicity.
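As an illustration only (the coefficients and the precise form below are assumptions, not expressions taken from the paper), the reported scaling corresponds to a small-q expansion of the elastic surface free-energy density of the form

f_{\mathrm{el}}(q) \simeq a\,K\,q\ln q + b\,K\,q + \mathcal{O}(q^{2}), \qquad q \to 0,

where K stands for a Frank elastic constant and a, b are dimensionless coefficients fixed by the sawtooth geometry and the nucleated disclinations.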
Abstract:
The crustal and lithospheric mantle structure at the southern segment of the west Iberian margin was investigated along a 370 km long seismic transect. The transect runs from unthinned continental crust onshore to oceanic crust, crossing the ocean-continent transition (OCT) zone. The wide-angle data set includes recordings from 6 OBSs and 2 inland seismic stations. Kinematic and dynamic modeling provided a 2D velocity model that proved to be consistent with the modeled free-air anomaly data. The interpretation of coincident multi-channel near-vertical and wide-angle reflection data sets allowed the identification of four main crustal domains: (i) continental (east of 9.4°W); (ii) continental thinning (9.4°W-9.7°W); (iii) transitional (9.7°W to ~10.5°W); and (iv) oceanic (west of ~10.5°W). In the continental domain the complete crustal section of slightly thinned continental crust is present. The upper (UCC, 5.1-6.0 km/s) and the lower continental crust (LCC, 6.9-7.2 km/s) are seismically reflective and have intermediate to low P-wave velocity gradients. The middle continental crust (MCC, 6.35-6.45 km/s) is generally unreflective, with a low velocity gradient. The main thinning of the continental crust occurs in the thinning domain by attenuation of the UCC and the LCC. Major thinning of the MCC starts to the west of the LCC pinchout point, where it rests directly upon the mantle. In the thinning domain the Moho slope is at least 13° and the continental crust thickness decreases seaward from 22 to 11 km over a ~35 km distance, stretched by a factor of 1.5 to 3. In the oceanic domain a two-layer high-gradient igneous crust (5.3-6.0 km/s; 6.5-7.4 km/s) was modeled. The intra-crustal interface correlates with prominent mid-basement, 10-15 km long reflections in the multi-channel seismic profile. Strong secondary reflected PmP phases require a first-order discontinuity at the Moho. The sedimentary cover can be as thick as 5 km and the igneous crustal thickness varies from 4 to 11 km in the west, where the profile reaches the Madeira-Tore Rise. In the transitional domain the crust has a complex structure that varies both horizontally and vertically. Beneath the continental slope it includes exhumed continental crust (6.15-6.45 km/s). Strong diffractions were modeled to originate at the lower interface of this layer. The western segment of this transitional domain is highly reflective at all levels, probably due to dykes and sills, according to the high apparent susceptibility and density modeled at this location. The sub-Moho mantle velocity is found to be 8.0 km/s, but velocities smaller than 8.0 km/s confined to short segments are not excluded by the data. Strong P-wave wide-angle reflections are modeled to originate at a depth of 20 km within the lithospheric mantle, under the eastern segment of the oceanic domain, or even deeper in the transitional domain, suggesting a layered structure for the lithospheric mantle. Both the interface depths and the velocities of the continental section are in good agreement with the conjugate Newfoundland margin. A ~40 km wide OCT with a geophysical signature distinct from that of the OCT to the north favors a two-pulse continental breakup.
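For orientation only (the pure-shear definition below is our assumption, not necessarily the one used in the paper): with a stretching factor \beta = t_{\mathrm{initial}} / t_{\mathrm{final}}, the quoted thinning from 22 km to 11 km gives \beta = 22/11 = 2, consistent with the stated range of 1.5 to 3 across the thinning domain.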
Abstract:
Recent literature has proved that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −infinity and whose expected return tends to +infinity, i.e., (risk = −infinity, return = +infinity). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to some possible measurement errors is provided as well. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and then the good deal is given by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is interesting in itself, even if there are short-selling restrictions.
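In the notation suggested by the abstract (the symbols below are ours, not the paper's), a good deal is a sequence of portfolios (y_n) whose risk, measured by \rho \in \{VaR, CVaR\}, and whose expected return diverge simultaneously:

\lim_{n \to \infty} \rho(y_n) = -\infty, \qquad \lim_{n \to \infty} \mathbb{E}[y_n] = +\infty.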
Abstract:
We have performed Surface Evolver simulations of two-dimensional hexagonal bubble clusters consisting of a central bubble of area lambda surrounded by s shells, or layers, of bubbles of unit area. Clusters of up to twenty layers have been simulated, with lambda varying between 0.01 and 100. In monodisperse clusters (i.e., for lambda = 1) [M.A. Fortes, F. Morgan, M. Fatima Vaz, Philos. Mag. Lett. 87 (2007) 561] both the average pressure of the entire cluster and the pressure in the central bubble are decreasing functions of s and approach 0.9306 for very large s, which is the pressure in a bubble of an infinite monodisperse honeycomb foam. Here we address the effect of changing the central bubble area lambda. For small lambda the pressure in the central bubble and the average pressure were both found to decrease with s, as in monodisperse clusters. However, for large lambda, the pressure in the central bubble and the average pressure increase with s. The average pressure of large clusters was found to be independent of lambda and to approach 0.9306 asymptotically. We have also determined the cluster surface energies given by the equation of equilibrium for the total energy in terms of the area and the pressure in each bubble. When the pressures in the bubbles are not available, an approximate equation derived by Vaz et al. [M. Fatima Vaz, M.A. Fortes, F. Graner, Philos. Mag. Lett. 82 (2002) 575] was shown to provide good estimates of the cluster energy, provided the bubble area distribution is narrow. This approach does not take cluster topology into account. Using this approximate equation, we find a good correlation between the Surface Evolver simulations and the estimated values of energies and pressures. (C) 2008 Elsevier B.V. All rights reserved.
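A hedged statement of the kind of equilibrium relation referred to (with the line tension set to one and the bubble pressures p_i measured relative to the exterior; the exact form used by the authors may differ): the total edge energy of a two-dimensional equilibrium cluster satisfies

E = 2 \sum_i p_i A_i,

where A_i is the area of bubble i, which is why the cluster energy can be evaluated once the individual bubble pressures are known.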
Abstract:
We investigate nematic wetting and filling transitions of crenellated surfaces (rectangular gratings) by numerical minimization of the Landau-de Gennes free energy as a function of the anchoring strength, for a wide range of the surface geometrical parameters: depth, width, and separation of the crenels. We have found a rich phase behavior that depends in detail on the combination of the surface parameters. Compared to simple fluids, which undergo a continuous filling or unbending transition, where the surface changes from a dry to a filled state, followed by a wetting or unbinding transition, where the thickness of the adsorbed fluid becomes macroscopic and the interface unbinds from the surface, nematics at crenellated surfaces reveal an intriguingly rich behavior: in shallow crenels only wetting is observed, while in deep crenels only filling transitions occur; for intermediate surface geometrical parameters, a new class of filled states is found, characterized by bent isotropic-nematic interfaces, which persist for surfaces structured on large scales compared to the nematic correlation length. The global phase diagram displays two wet and four filled states, all separated by first-order transitions. For crenels in the intermediate regime, re-entrant filling transitions driven by the anchoring strength are observed.
Abstract:
The use of air-conditioning equipment is increasingly common, and new technologies keep emerging to increase the efficiency of the process; in this context, the installation of an Air Handling Unit with an Economizer is the central topic of this dissertation. Free-cooling is based on the total or partial use of outdoor air to condition a space, whenever the optimal conditions for the process are met and the system has a controller that can manage the opening of the dampers according to the measured outdoor and indoor temperatures. The analysis of the outdoor and indoor conditions is fundamental for sizing an Economizer. It is necessary to determine the local climate type in order to select the type of process control, and also to collect the outdoor temperature profile to justify the use of free-cooling at the site. The determination of the indoor conditions, such as the quantification of lighting use, occupancy, and equipment, is needed to determine the capacity of the cooling or heating coils and, when free-cooling is used, to determine the outdoor air flow rate to be supplied. The thermal balance of the premises makes explicit all the loads acting on the building and allows the capacity required for air conditioning to be quantified. Then, adding the Economizer to the system and comparing the two systems, a reduction in the operating costs of the cooling coil is verified. The development of a control algorithm is fundamental to guarantee the efficiency of the Economizer, where the control of the supply and return air dampers is necessarily tied to the readings of the outdoor and indoor temperature sensors. The amount of fresh air supplied to the space ultimately depends on the ratio between the sensible load of the space and the temperature difference read between the two sensors.
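As a hedged illustration of the closing statement (a standard sensible-heat balance, not necessarily the exact expression used in the dissertation), the outdoor-air volume flow to be supplied follows from the sensible load and the temperature difference read by the two sensors:

\dot{V} = \frac{\dot{Q}_{\mathrm{sens}}}{\rho\, c_p\, (T_{\mathrm{interior}} - T_{\mathrm{exterior}})},

with \rho the air density and c_p its specific heat at constant pressure.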
Abstract:
In this paper we present a user-centered interface for a scheduling system. The purpose of this interface is to provide graphical and interactive ways of defining a scheduling problem. To create such a user interface, an evaluation-centered user interaction development method was adopted: the star life cycle. The created prototype comprises the Task Module and the Scheduling Problem Module. The first one allows users to define a sequence of operations, i.e., a task. The second one enables the definition of a scheduling problem, which consists of a set of tasks. Both modules are equipped with a set of real-time validations to ensure the correct definition of the data required by the scheduling module of the system. The usability evaluation allowed us to measure the ease of interaction and to observe the different forms of interaction of each participant, namely the reactions to the real-time validation mechanism.
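A minimal Java sketch of the data the two modules capture, under the assumption (ours, not the paper's) that a task is an ordered list of operations and a scheduling problem is a set of tasks; the validation method only illustrates the kind of real-time check described, not the prototype's actual classes.

import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Illustrative only: names and fields are hypothetical.
class Operation {
    final String machine;
    final int processingTime; // must be positive; checked by the real-time validation below
    Operation(String machine, int processingTime) {
        this.machine = machine;
        this.processingTime = processingTime;
    }
}

class Task {
    final List<Operation> operations = new ArrayList<>(); // ordered sequence of operations

    // Real-time style validation: reject an operation as soon as it is entered with invalid data.
    boolean addOperation(Operation op) {
        if (op.machine == null || op.machine.isEmpty() || op.processingTime <= 0) {
            return false; // the UI would flag the faulty field immediately
        }
        return operations.add(op);
    }
}

class SchedulingProblem {
    final Set<Task> tasks = new LinkedHashSet<>(); // a scheduling problem is a set of tasks
}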
Abstract:
Biodiesel is the main alternative to fossil diesel and it may be produced from different feedstocks, such as semi-refined vegetable oils, waste frying oils, or animal fats. However, these feedstocks usually contain significant amounts of free fatty acids (FFA) that make them inadequate for the direct base-catalyzed transesterification reaction (where the FFA content should be lower than 4%). The present work describes a possible method for the pre-treatment of oils with a high content of FFA (20 to 50%) by esterification with glycerol. In order to reduce the FFA content, the reaction between these FFA and an esterification agent is carried out before the transesterification reaction. The reaction kinetics was studied in terms of its main factors, such as temperature, glycerol excess (%), amount of catalyst (%), stirring velocity, and type of catalyst used. The results showed that glycerolysis is a promising pre-treatment for acidic oils or fats (> 20%), as it leads to the production of an intermediate material with a low content of FFA that can be used directly in the transesterification reaction for the production of biodiesel. (C) 2011 Elsevier B.V. All rights reserved.
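The pre-treatment relies on esterifying the free fatty acids with glycerol; written for the monoglyceride product only (the actual product distribution also includes di- and triglycerides), the overall glycerolysis reaction is

\mathrm{RCOOH} + \mathrm{C_3H_5(OH)_3} \longrightarrow \mathrm{RCOOC_3H_5(OH)_2} + \mathrm{H_2O},

which removes FFA while producing glycerides that are themselves substrates for the subsequent transesterification.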
Abstract:
This final master's work refers to the curricular internship carried out within the Master's in Civil Engineering, building construction branch, at Instituto Superior de Engenharia de Lisboa. The internship took place at the company ENGEXPOR Consultores de Engenharia, S.A., on the construction site of the Metropólis Interface Sul – ZON Multimédia office building, focusing mainly on the finishing, coatings, and special technical installations contract, and ran between February and September 2012. During the internship the student was integrated into a young, dynamic, and proactive team, where she performed various management, coordination, and supervision functions for the works on the building's exterior envelope, with the goal of developing her skills, chiefly in understanding and analysing the construction processes, the design, the relationships between the various stakeholders, and quality assurance.
Abstract:
In this paper a new free flight instrument is presented. The instrument, named FlyMaster, is distinguished from others not only at the hardware level, since it is the first one based on a PDA and equipped with an RF interface for wireless sensors, but also at the software level, since its structure was developed following guidelines from Ambient Intelligence and ubiquitous, context-aware mobile computing. In this sense, the software has several features that avoid the need for pilot intervention during flight. Basically, the FlyMaster adapts the displayed information to each flight situation. Furthermore, the FlyMaster has its own way of displaying information.
Abstract:
In real optimization problems, the analytical expression of the objective function is usually not known, nor are its derivatives, or they are too complex to handle. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, or derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem. In this problem a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter dependent than a penalty function. In this work, we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method does not compute or approximate any derivatives, penalty constants, or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use the simplex to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
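A hedged Java sketch of the filter acceptance test that such a method relies on; this illustrates the general filter idea, with hypothetical names, and is not the authors' implementation.

import java.util.ArrayList;
import java.util.List;

// Illustrative filter: stores (constraint violation h, objective f) pairs of accepted points.
// A trial point is acceptable if no stored pair dominates it, i.e. if it improves either h or f
// with respect to every entry in the filter.
class Filter {
    private final List<double[]> entries = new ArrayList<>(); // each entry is {h, f}

    boolean isAcceptable(double h, double f) {
        for (double[] e : entries) {
            if (h >= e[0] && f >= e[1]) {
                return false; // dominated: neither the violation nor the objective improves
            }
        }
        return true;
    }

    void add(double h, double f) {
        // Remove entries dominated by the new point, then store it.
        entries.removeIf(e -> e[0] >= h && e[1] >= f);
        entries.add(new double[] {h, f});
    }
}

In a simplex filter iteration, the reflected (or contracted) trial vertex would then be kept only if isAcceptable returns true for its (h, f) pair.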
Abstract:
The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration: the first one reduces a measure of infeasibility, while in the second the objective function value is reduced. In real optimization problems, the objective function is usually not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where the calculation of the derivatives or the verification of their existence is not necessary: direct search methods or derivative-free methods are examples of such techniques. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants, or Lagrange multipliers.
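For a problem of the form \min f(x) subject to g_j(x) \le 0, a common choice (an assumption here, not necessarily the authors' exact one) for the infeasibility measure reduced in the first phase is

h(x) = \left\| \max\{0, g(x)\} \right\|,

and a trial point x is accepted by the filter if, for every stored pair (h_k, f_k), either h(x) < h_k or f(x) < f_k holds.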