959 results for Lagrangian Formulation
Abstract:
The authoritarian regime of the Portuguese Estado Novo (New State), the longest dictatorship in twentieth-century Western Europe, suffered one of its most serious threats during the late 1950s and the whole of the following decade. An array of events and dynamics of opposition to the regime and condemnation of the political and social situation in Portugal appeared at that time. One of the core groups that displayed their dissidence in the 1960s, with the awakening of their critical conscience, originated in Catholic sectors that rallied the laity and the clergy to express their disagreement with, or even break with, the government of Salazar (and, later, Marcelo Caetano). This article aims to establish the role of print culture and, in particular, publishing in the opposition’s mobilisation of Catholics who criticised the Estado Novo. It will also closely examine the contribution of certain publishers to the formulation of the terms of this mobilisation, by publishing new authors and topics and creating new printed forums (e.g. periodicals) for discussion and reflection. The most detailed case will be that of the publishing house Livraria Moraes Editora, led by the publisher António Alçada Baptista.
Abstract:
This letter presents a new parallel method for hyperspectral unmixing formed by the efficient combination of two popular methods: vertex component analysis (VCA) and sparse unmixing by variable splitting and augmented Lagrangian (SUNSAL). First, VCA extracts the endmember signatures, and then SUNSAL is used to estimate the abundance fractions. Both techniques are highly parallelizable, which significantly reduces the computing time. A design of the two methods for commodity graphics processing units is presented and evaluated. Experimental results obtained for simulated and real hyperspectral data sets reveal speedups of up to 100 times, which provide the real-time response required by many remotely sensed hyperspectral applications.
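As a rough illustration of the two-stage pipeline described above (endmember extraction followed by abundance estimation), the sketch below uses a greedy orthogonal-projection search as a stand-in for VCA and SciPy's non-negative least squares as a stand-in for SUNSAL; these simplifications and function names are assumptions, not the authors' GPU implementation.

```python
# Minimal CPU sketch of the two-stage unmixing pipeline: (1) extract endmember
# signatures, (2) estimate per-pixel abundance fractions. An ATGP-like greedy
# projection search stands in for VCA, and non-negative least squares stands in
# for SUNSAL (assumptions for illustration only).
import numpy as np
from scipy.optimize import nnls

def extract_endmembers(Y, p):
    """Greedily pick p extreme pixels of Y (bands x pixels) via orthogonal projections."""
    idx = [int(np.argmax(np.linalg.norm(Y, axis=0)))]
    for _ in range(p - 1):
        E = Y[:, idx]                                    # bands x selected
        P = np.eye(Y.shape[0]) - E @ np.linalg.pinv(E)   # projector onto orthogonal complement
        idx.append(int(np.argmax(np.linalg.norm(P @ Y, axis=0))))
    return Y[:, idx]                                     # bands x p endmember matrix

def estimate_abundances(Y, M):
    """Per-pixel non-negative least squares, renormalised to sum to one."""
    A = np.stack([nnls(M, y)[0] for y in Y.T], axis=1)
    return A / np.maximum(A.sum(axis=0), 1e-12)

# Toy usage: 50 bands, 1000 pixels, 3 endmembers.
Y = np.abs(np.random.default_rng(1).standard_normal((50, 1000)))
M = extract_endmembers(Y, p=3)
A = estimate_abundances(Y, M)
print(M.shape, A.shape)   # (50, 3) (3, 1000)
```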
Abstract:
OBJECTIVE To analyze the patterns and legal requirements of methylphenidate consumption. METHODS We conducted a cross-sectional study of the data from prescription notification forms and balance lists of sales of drugs – psychoactive and others – subject to special control in the fifth largest city of Brazil, in 2006. We determined the defined and prescribed daily doses, the average prescription and dispensation periods, and the regional distribution of sales in the municipality. In addition, we estimated the costs of drug acquisition and analyzed the individual drug consumption profile using the Lorenz curve. RESULTS The balance lists data covered all notified sales of the drug, while the data from prescription notification forms covered 50.6% of the pharmacies that sold it, including those with the highest sales volumes. Total methylphenidate consumption was 0.37 DDD/1,000 inhabitants/day. Sales were concentrated in more developed areas, and regular-release tablets were the most commonly prescribed pharmaceutical formulation. In some regions of the city, approximately 20.0% of the prescriptions and dispensations exceeded 30 mg/day and 30 days of treatment. CONCLUSIONS Methylphenidate was widely consumed in the municipality, mainly in the most developed areas. Notably, the formulations with the highest abuse risk were the most widely consumed. Both its prescription and dispensation contrasted with current pharmacotherapeutic recommendations and legal requirements. Therefore, the commercialization of methylphenidate should be monitored more closely, and its use in the treatment of behavioral changes of psychological disorders needs to be discussed in detail, in line with the concepts of the quality use of medicines.
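The headline figure of 0.37 DDD/1,000 inhabitants/day combines the total amount dispensed, the WHO defined daily dose (taken here as 30 mg for methylphenidate), the population and the length of the study period; the sketch below shows that arithmetic with illustrative numbers, not the study's data.

```python
# Sketch of the DDD/1,000 inhabitants/day indicator used in the abstract.
# All inputs below are illustrative placeholders, not the study's data;
# the defined daily dose for methylphenidate is taken as 30 mg (assumption).
def ddd_per_1000_inhab_day(total_mg_dispensed, ddd_mg, population, days):
    return (total_mg_dispensed / ddd_mg) / (population * days) * 1000

# Example: 10 million mg dispensed over one year in a city of 2.4 million people.
print(round(ddd_per_1000_inhab_day(10_000_000, 30, 2_400_000, 365), 2))  # ~0.38
```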
Abstract:
Work carried out under the supervision of Prof. António Brandão Moniz for the course “Factores Sociais da Inovação” (Social Factors of Innovation) of the Master's programme in Informatics Engineering at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa.
Abstract:
The process of resources systems selection plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/V Es). However, resources systems selection remains a difficult problem to solve in a D/A/VE, as this paper points out. Broadly, the selection problem has been approached from different perspectives, giving rise to different kinds of models/algorithms to solve it. To assist the development of an intelligent and flexible web prototype tool (broker tool) that integrates all the selection model activities and tools, and that can adapt to each D/A/VE project or instance (the major goal of our final project), this paper presents a formulation of one kind of resources selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, thereby defining the domain of applicability of the algorithms for the problem studied. The limitations detected point to the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of the algorithms' limitations is very important in order to develop and select, based on the problem features, the most suitable algorithm and guarantee good performance.
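One plausible reading of the integer-programming formulation above is the assignment of exactly one pre-selected resource to each processing task at minimum total cost; the sketch below states that model and hands it to SciPy's branch-and-bound MILP solver. The data and the exact formulation are illustrative assumptions, not the paper's model.

```python
# Small resource-selection ILP sketch: pick one pre-selected resource per
# processing task at minimum total cost (illustrative formulation and data).
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

cost = np.array([[4.0, 2.0, 7.0],     # task 0: cost of resources 0..2
                 [3.0, 6.0, 1.0]])    # task 1: cost of resources 0..2
n_tasks, n_res = cost.shape
c = cost.ravel()                       # decision variables x[t, r], flattened row-wise

# Exactly one resource per task: sum_r x[t, r] = 1.
A = np.zeros((n_tasks, n_tasks * n_res))
for t in range(n_tasks):
    A[t, t * n_res:(t + 1) * n_res] = 1.0

res = milp(c,
           constraints=LinearConstraint(A, lb=1, ub=1),
           integrality=np.ones_like(c),
           bounds=Bounds(0, 1))
print(res.x.reshape(n_tasks, n_res))   # selected assignments (0/1)
print(res.fun)                         # minimum total cost: 2 + 1 = 3
```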
Abstract:
Thrust ball bearings lubricated with several different greases were tested on a modified Four-Ball Machine, where the four-ball arrangement was replaced by a bearing assembly. The friction torque and operating temperatures in a thrust ball bearing were measured during the tests. At the end of each test a grease sample was analyzed through ferrographic techniques in order to quantify and evaluate bearing wear. A rolling bearing friction torque model was used, and the coefficient of friction in full-film lubrication was determined for each grease as a function of the operating conditions. The experimental results showed that grease formulation had a very significant influence on friction torque and operating temperature. The friction torque depends on the viscosity of the grease base oil, on its nature (mineral, ester, PAO, etc.) and on the coefficient of friction in full-film conditions, but also on the interaction between the grease thickener and the base oil, which affects contact replenishment and starvation and thus the friction torque.
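The abstract does not say which rolling bearing friction torque model was used; a classical option compatible with the description is Palmgren's split of the total torque into a speed/viscosity-dependent term and a load-dependent term, sketched below with purely illustrative factors.

```python
# Palmgren-type friction torque sketch (assumption: the abstract does not name
# the model actually used). Units: dm in mm, nu in cSt, n in rpm, torque in N.mm.
# The factors f0, f1 and the equivalent load P1 depend on the bearing and grease;
# the values below are illustrative only.
def palmgren_friction_torque(f0, nu_cSt, n_rpm, dm_mm, f1, P1_N):
    if nu_cSt * n_rpm >= 2000.0:
        M0 = 1e-7 * f0 * (nu_cSt * n_rpm) ** (2.0 / 3.0) * dm_mm ** 3   # speed/viscosity term
    else:
        M0 = 160e-7 * f0 * dm_mm ** 3                                   # low-speed limit
    M1 = f1 * P1_N * dm_mm                                              # load-dependent term
    return M0 + M1

print(palmgren_friction_torque(f0=3.0, nu_cSt=100.0, n_rpm=1000.0,
                               dm_mm=60.0, f1=0.0009, P1_N=7000.0))     # ~518 N.mm
```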
Abstract:
The aim of this study was to evaluate the adequacy of the Brazilian legislation on fluoride toothpaste. A search was conducted in the LILACS, Medline and SciELO databases for studies on the fluoride concentration found in Brazilian toothpastes, using health sciences descriptors. Publications since 1981 have shown that some Brazilian toothpastes are not able to maintain, throughout their shelf life, a minimum of 1,000 ppm F of soluble fluoride in the formulation. However, the Brazilian regulation (ANVISA, Resolution 79, August 28, 2000) only sets the maximum total fluoride (0.15%; 1,500 ppm F) that a toothpaste may contain, not the minimum concentration of soluble fluoride that it should contain to have anticaries potential, which according to systematic reviews should be 1,000 ppm F. Therefore, the Brazilian regulation on fluoride toothpastes needs to be revised to assure the efficacy of these products for caries control.
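For reference, the 0.15% ceiling and the 1,500 ppm F figure in the regulation are the same quantity expressed in different units, since 1% by mass corresponds to 10,000 ppm:

```python
# Unit check for the regulatory limit cited in the abstract.
max_total_fluoride_percent = 0.15
print(max_total_fluoride_percent * 10_000)   # 1500.0 ppm F
```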
Abstract:
This study discusses the new free-trade agreements, which are based on breaking down tariff and technical barriers and normally exclude most of the world's poorest countries. Considering the current context of economic globalization and its health impacts, seven controversial points of these treaties and their possible implications for global public health are presented, mainly regarding health equity and other health determinants. Finally, it proposes greater participation of society and health professionals in the formulation and discussion of these treaties, and deeper engagement of Brazil in this important international agenda.
Abstract:
This paper addresses the calculation of derivatives of fractional order for non-smooth data. The noise is avoided by adopting an optimization formulation using genetic algorithms (GA). Given the flexibility of the evolutionary schemes, a hierarchical GA composed of a series of two GAs, each with a distinct fitness function, is established.
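For context, optimization formulations of this kind typically start from a discrete fractional-derivative operator whose sensitivity to noisy data motivates the GA search; the sketch below implements the standard Grünwald-Letnikov approximation, chosen here as an assumption since the abstract does not name the discretization.

```python
# Grünwald-Letnikov approximation of a fractional derivative of order alpha on
# uniformly sampled data (an assumption for illustration; the abstract does not
# specify the discretization it builds on).
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Return D^alpha f at each sample, using all samples to the left."""
    n = len(f)
    w = np.empty(n)                       # weights (-1)^k * C(alpha, k)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    out = np.empty(n)
    for i in range(n):
        out[i] = np.dot(w[:i + 1], f[i::-1]) / h ** alpha
    return out

t = np.linspace(0.0, 1.0, 101)
print(gl_fractional_derivative(t, alpha=0.5, h=t[1] - t[0])[-1])
# For f(t) = t, the exact half-derivative at t = 1 is 2/sqrt(pi) ≈ 1.128;
# the estimate converges to it as h -> 0.
```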
Abstract:
This paper introduces a new unsupervised hyperspectral unmixing method conceived for linear but highly mixed hyperspectral data sets, in which the simplex of minimum volume, usually estimated by purely geometrically based algorithms, is far away from the true simplex associated with the endmembers. The proposed method, an extension of our previous studies, resorts to a statistical framework. The abundance fraction prior is a mixture of Dirichlet densities, thus automatically enforcing the constraints on the abundance fractions imposed by the acquisition process, namely nonnegativity and sum-to-one. A cyclic minimization algorithm is developed in which: 1) the number of Dirichlet modes is inferred based on the minimum description length principle; 2) a generalized expectation maximization algorithm is derived to infer the model parameters; and 3) a sequence of augmented Lagrangian-based optimizations is used to compute the signatures of the endmembers. Experiments on simulated and real data are presented to show the effectiveness of the proposed algorithm in unmixing problems beyond the reach of the geometrically based state-of-the-art competitors.
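The point that a Dirichlet (or mixture-of-Dirichlet) prior enforces the abundance constraints by construction can be checked directly by sampling; the mixture weights and parameters below are illustrative assumptions.

```python
# Abundance fractions drawn from a mixture of Dirichlet densities satisfy the
# physical constraints named in the abstract by construction: every component
# is nonnegative and each vector sums to one.
import numpy as np

rng = np.random.default_rng(0)
modes = [np.array([5.0, 1.0, 1.0, 1.0]),      # illustrative two-mode mixture
         np.array([1.0, 1.0, 4.0, 4.0])]      # over 4 endmembers
weights = [0.3, 0.7]

choice = rng.choice(len(modes), size=10_000, p=weights)
samples = np.stack([rng.dirichlet(modes[c]) for c in choice])

print(samples.min() >= 0.0)                    # True: nonnegativity
print(np.allclose(samples.sum(axis=1), 1.0))   # True: sum-to-one
```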
Abstract:
An improved class of Boussinesq systems of arbitrary order using a wave surface elevation and velocity potential formulation is derived. Dissipative effects and wave generation due to a time-dependent varying seabed are included; thus, high-order source functions are considered. To reduce the system order while maintaining some dispersive characteristics of the higher-order models, an extra O(μ^(2n+2)) term (n ∈ ℕ) is included in the velocity potential expansion. We introduce a nonlocal continuous/discontinuous Galerkin FEM with inner penalty terms to compute the numerical solutions of the improved fourth-order models. The discretization of the spatial variables is made using continuous P2 Lagrange elements. A predictor-corrector scheme with an initialization given by an explicit Runge-Kutta method is used for the time integration. Moreover, a CFL-type condition is deduced for the linear problem with constant bathymetry. To demonstrate the applicability of the model, several test cases are considered; improved stability is achieved.
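The time integrator named above (a predictor-corrector march started by an explicit Runge-Kutta step) follows a generic pattern, sketched below for du/dt = f(t, u) with an AB2 predictor and a trapezoidal corrector; the actual Boussinesq right-hand side and the scheme order used in the paper are not reproduced here.

```python
# Generic predictor-corrector time stepping with an explicit RK4 start-up,
# as a sketch of the pattern named in the abstract (not the paper's scheme).
def rk4_step(f, t, u, dt):
    k1 = f(t, u)
    k2 = f(t + dt / 2, u + dt / 2 * k1)
    k3 = f(t + dt / 2, u + dt / 2 * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def predictor_corrector(f, t0, u0, dt, n_steps):
    t, u = t0, u0
    f_prev = f(t, u)
    u = rk4_step(f, t, u, dt)                              # RK4 initialization
    t += dt
    for _ in range(n_steps - 1):
        f_now = f(t, u)
        u_pred = u + dt * (1.5 * f_now - 0.5 * f_prev)     # AB2 predictor
        u = u + dt / 2 * (f_now + f(t + dt, u_pred))       # trapezoidal corrector
        f_prev, t = f_now, t + dt
    return u

# Test on du/dt = -u, u(0) = 1: exact solution exp(-1) ≈ 0.3679 at t = 1.
print(predictor_corrector(lambda t, u: -u, 0.0, 1.0, 0.01, 100))
```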
Abstract:
IEE Proceedings Vision, Image & Signal Processing, vol. 152, no. 6
Abstract:
In an increasingly competitive market, it is essential for companies to produce more with fewer resources, increasing internal efficiency by optimising their processes. This is the context of Lean Manufacturing, a methodology whose goal is to create value for stakeholders by eliminating waste along the value chain. This project describes the analysis and the formulation of solutions for the production process of a Service Module, a product that is part of the electrical system of a lift. For the analysis of the problem we used lean techniques and tools such as value stream mapping (VSM), the process diagram and the spaghetti diagram. For the formulation of solutions we used value stream design (VSD), the 5S methodology, the Kanban system and the creation of continuous flow, through the takt time concept, the pull system, the definition of the pacemaker process, levelled scheduling (Heijunka), the pitch time concept and the levelling box (Heijunka Box). With this project we intend to demonstrate that implementing one-piece flow through the Lean Manufacturing philosophy adds quality to the product, creates flexibility, increases productivity, frees up production areas, increases safety, reduces stock costs and increases organisational motivation.
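The flow design described above hinges on the takt time (available production time divided by customer demand) and the pitch time (takt time multiplied by the pack-out quantity); the sketch below shows that arithmetic with illustrative numbers, not data from the project.

```python
# Takt time and pitch time as used for continuous-flow design.
# All numbers are illustrative assumptions, not data from the project.
available_seconds_per_shift = 8 * 3600 - 2 * 15 * 60   # 8 h minus two 15-min breaks
customer_demand_per_shift = 180                         # Service Modules per shift

takt_time = available_seconds_per_shift / customer_demand_per_shift
pitch_time = takt_time * 6                              # pack-out quantity of 6 units

print(round(takt_time), "s per unit")    # 150 s
print(round(pitch_time / 60), "min")     # 15 min per container
```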
Abstract:
The high and growing share of wind power in the generation mix poses major challenges to system operators in grid management and generation planning. The uncertainty associated with wind production conditions the unit commitment and economic dispatch of thermal generators, since the actual wind production can differ considerably from the forecast. This work proposes two optimisation methodologies for the scheduling of thermal generators based on Mixed Integer Programming. The aim is to find scheduling solutions that minimise the negative effects of wind power integration on the electrical system. Initially the generator scheduling problem is formulated without considering wind power integration; the penetration of wind power in the electrical system is considered afterwards. In the first proposed model, the problem is formulated as a stochastic optimisation problem, in which all wind production scenarios are taken into account in the optimisation process. In the second model, the problem is formulated as a deterministic optimisation problem: the scheduling is carried out for each wind production scenario and the best solution is then determined by means of evaluation indicators. Simulations were carried out for different levels of spinning reserve, and the results show that a high share of wind power in the generation mix compromises the security and firmness of supply, due to the volatile and intermittent nature of wind production; to maintain the same security levels, the system must have enough spinning reserve capacity to compensate for forecast errors.
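In generic unit-commitment notation, the stochastic variant described above weighs all wind scenarios in a single optimization; a sketch of that objective and of the spinning-reserve requirement is given below, with symbols that are assumptions rather than the exact formulation of this work.

```latex
% Scenario-weighted (stochastic) unit-commitment sketch; symbols are generic
% assumptions, not the exact model of the text.
\min_{u,\,P}\; \sum_{s} \pi_s \sum_{t} \sum_{g}
  \left[ C_g\left(P_{g,t,s}\right) + SU_g\, y_{g,t} \right]
\quad\text{subject to}\quad
\sum_{g} P_{g,t,s} + W_{t,s} = D_t, \qquad
\sum_{g} \left( u_{g,t}\, \overline{P}_g - P_{g,t,s} \right) \ge R_t,
\qquad \forall\, t, s,
```

where π_s is the probability of wind scenario s, u_{g,t} and y_{g,t} are the binary commitment and start-up variables, C_g and SU_g the production and start-up costs, W_{t,s} the wind power in scenario s, D_t the demand and R_t the spinning-reserve requirement; the deterministic variant solves the same problem for each scenario separately.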
Abstract:
The reuse of end-of-life tyres proves to be an effective and promising alternative in the construction industry, namely the use of this waste in retaining walls. The main objective of this work is to present a technique for reusing end-of-life tyres in the construction of gravity walls combining soil and tyres. To this end, a study carried out in Brazil by Sieira, Sayão, Medeiros and Gerscovich was taken as a reference to assess the efficiency and cost of this type of structure, comparing it with a traditional plain concrete retaining wall. First, the safety of the soil-tyre wall was assessed in accordance with the methodology proposed in Eurocode 7 (NP EN 1997-1, 2010), considering the geometry and the material properties presented in the reference study and using Rocscience's Slide software to verify global stability. The numerical analysis carried out in the Brazilian reference case study was then reproduced, also resorting to a finite element formulation with Rocscience's Phase2 software. Finally, using the Slide program once more, the geometry of a plain concrete wall was defined so as to guarantee the same global-stability factor of safety obtained with the soil-tyre wall, and the respective costs were compared. This work confirmed the efficiency and low cost of this construction solution, although more detailed studies are needed to reinforce these conclusions.
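The comparison above matches the two walls on the global-stability factor of safety; limit-equilibrium programs such as Slide essentially report the ratio of resisting to driving moments along the critical slip surface, as in the sketch below (the slice values are illustrative assumptions, and Eurocode 7 then applies partial factors rather than a single global factor).

```python
# Global-stability factor of safety as the ratio of resisting to driving moments
# along the critical slip surface. Slice values are illustrative assumptions only.
resisting_moments = [120.0, 310.0, 280.0, 150.0]   # kN.m per slice
driving_moments   = [100.0, 240.0, 200.0, 110.0]   # kN.m per slice

fs = sum(resisting_moments) / sum(driving_moments)
print(round(fs, 2))   # e.g. 1.32
```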