Abstract:
Most finite element packages use the Newmark algorithm for time integration of structural dynamics. Various algorithms have been proposed to better optimize the high-frequency dissipation of this algorithm. Hulbert and Chung proposed both implicit and explicit forms of the generalized-alpha method. The algorithms optimize high-frequency dissipation effectively, and despite recent work on algorithms that possess momentum-conserving/energy-dissipative properties in a non-linear context, the generalized-alpha method remains an efficient way to solve many problems, especially with adaptive timestep control. However, the implicit and explicit algorithms use incompatible parameter sets and cannot be used together in a spatial partition, whereas this can be done for the Newmark algorithm, as Hughes and Liu demonstrated, and for the HHT-alpha algorithm developed from it. The present paper shows that the explicit generalized-alpha method can be rewritten so that it becomes compatible with the implicit form. All four algorithmic parameters can be matched between the explicit and implicit forms. An element interface between implicit and explicit partitions can then be used, analogous to that devised by Hughes and Liu to extend the Newmark method. The stability of the explicit/implicit algorithm is examined in a linear context and found to exceed that of the explicit partition. The element partition is significantly less dissipative of intermediate frequencies than one using the HHT-alpha method. The explicit algorithm can also be rewritten so that the discrete equation of motion evaluates forces from displacements and velocities found at the predicted mid-point of a cycle. Copyright (C) 2003 John Wiley & Sons, Ltd.
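Since the abstract names the Chung-Hulbert generalized-alpha update but gives no formulas, a minimal single-degree-of-freedom sketch of the implicit form is shown below, parameterized by the high-frequency spectral radius rho_inf; the test problem and parameter values are illustrative assumptions, not material from the paper.

```python
# Minimal sketch of the implicit generalized-alpha method (Chung & Hulbert)
# for a linear single-DOF oscillator m*a + c*v + k*d = f(t).
import numpy as np

def generalized_alpha(m, c, k, f, d0, v0, h, n_steps, rho_inf=0.8):
    # Spectral-radius-based parameter set of Chung & Hulbert (1993).
    alpha_m = (2.0 * rho_inf - 1.0) / (rho_inf + 1.0)
    alpha_f = rho_inf / (rho_inf + 1.0)
    gamma = 0.5 - alpha_m + alpha_f
    beta = 0.25 * (1.0 - alpha_m + alpha_f) ** 2

    d, v = d0, v0
    a = (f(0.0) - c * v - k * d) / m          # consistent initial acceleration
    t = 0.0
    hist = [(t, d)]
    for _ in range(n_steps):
        t_next = t + h
        t_alpha = (1.0 - alpha_f) * t_next + alpha_f * t
        # Newmark-style predictors (terms independent of the unknown a_next).
        d_pred = d + h * v + h * h * (0.5 - beta) * a
        v_pred = v + h * (1.0 - gamma) * a
        # Momentum balance evaluated at the generalized mid-point.
        lhs = (1.0 - alpha_m) * m + (1.0 - alpha_f) * (gamma * h * c + beta * h * h * k)
        rhs = (f(t_alpha)
               - alpha_m * m * a
               - c * ((1.0 - alpha_f) * v_pred + alpha_f * v)
               - k * ((1.0 - alpha_f) * d_pred + alpha_f * d))
        a_next = rhs / lhs
        d = d_pred + beta * h * h * a_next
        v = v_pred + gamma * h * a_next
        a, t = a_next, t_next
        hist.append((t, d))
    return np.array(hist)

# Illustrative use: lightly damped oscillator under a suddenly applied unit load.
history = generalized_alpha(m=1.0, c=0.1, k=4.0, f=lambda t: 1.0,
                            d0=0.0, v0=0.0, h=0.05, n_steps=200)
print(history[-1])   # displacement oscillates about the static value f/k = 0.25
```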
Abstract:
In this paper we examine the effects of varying several experimental parameters in the Kane quantum computer architecture (A-gate voltage, the qubit depth below the silicon oxide barrier, and the back gate depth) to explore how these variables affect the density of the donor electron. In particular, we calculate the resonance frequency of the donor nuclei as a function of these parameters. To do this we calculated the donor electron wave function variationally using an effective-mass Hamiltonian approach with a basis of deformed hydrogenic orbitals. This approach was then extended to include the electric-field Hamiltonian and the silicon host geometry. We found that the phosphorus donor electron wave function was very sensitive to all the experimental variables studied in our work, and thus, to optimize the operation of these devices, it is necessary to control all the parameters varied in this paper.
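As a concrete illustration of the variational effective-mass step described above, the sketch below minimizes the energy of a hydrogenic 1s trial orbital over its radius. It is a toy calculation in effective Rydberg units under stated assumptions, not the authors' deformed-orbital basis or the full electric-field Hamiltonian.

```python
# Variational toy model: psi(r) ~ exp(-r/a) for the donor electron, with the
# energy written in effective Rydberg units so the exact minimum is
# E = -1 Ry* at a = 1 a_B*.
from scipy.optimize import minimize_scalar

def energy(a):
    # <T> + <V> for the 1s trial orbital: kinetic term 1/a^2,
    # Coulomb term -2/a (screened by the host dielectric).
    return 1.0 / a**2 - 2.0 / a

res = minimize_scalar(energy, bounds=(0.1, 10.0), method="bounded")
print(f"optimal radius a* = {res.x:.3f} a_B*,  E = {res.fun:.3f} Ry*")
# A perturbing A-gate field would add a term proportional to <z> to energy(a),
# shifting the optimal orbital and hence the hyperfine/resonance frequency.
```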
Abstract:
Morpho-physiological characteristics and chemical composition are directly related to the superior competitive ability of crops. This study makes a comparative analysis of dry matter production, leaf area and amount of epicuticular wax of three Sida species: S. urens L., S. rhombifolia L. and S. spinosa L. Plants were collected at three growth stages: V1, up to 10 fully expanded leaves; V2, between 11 leaves and flowering; and R, after flowering. At stages V2 and R, the highest number of leaves was recorded for S. rhombifolia, followed by S. spinosa at V2 and S. urens at R. These results were roughly proportional to leaf area for all species. S. spinosa at the vegetative stage produced the highest values of specific leaf area (SLA), with no significant differences between species at stage R. The amount of wax per unit of leaf area between species at the same developmental stage differed significantly only at the reproductive stage, where S. spinosa produced 23.18- and 6.23-fold more wax than S. urens and S. rhombifolia, respectively. Across the growth stages of each species, there was a decrease in the amount of wax with plant age and an increase in specific leaf area (SLA), number of leaves and dry matter. The leaves of the Sida species exhibit different characteristics, and this information can be used to optimize the use of herbicides in the control of these weeds.
Abstract:
This paper presents the establishment of a Virtual Producer/Consumer Agent (VPCA) in order to optimize the integrated management of distributed energy resources and to improve and control Demand Side Management (DSM) and its aggregated loads. The paper presents the VPCA architecture and the proposed function-based organization used to coordinate the several generation technologies, the different load types and the storage systems. This VPCA organization uses a framework based on data mining techniques to characterize the customers. The paper includes results of several experimental test cases, using real data and taking into account electricity generation resources as well as consumption data.
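A minimal sketch of the customer-characterization step the VPCA framework relies on is given below: clustering daily load profiles so consumers can be grouped into typical consumption classes. The synthetic data and the choice of k-means are illustrative assumptions, not the paper's actual data-mining pipeline.

```python
# Toy data-mining step: cluster synthetic daily load profiles (kW) to
# characterize consumers by typical consumption class.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
hours = np.arange(24)
# Residential-like profiles peak in the evening; industrial-like profiles
# are flat during working hours.  Both are assumptions for illustration.
residential = 1.0 + 0.8 * np.exp(-((hours - 20) ** 2) / 8.0) + 0.05 * rng.normal(size=(60, 24))
industrial = 3.0 * ((hours >= 8) & (hours <= 18)) + 0.3 + 0.1 * rng.normal(size=(40, 24))
profiles = np.vstack([residential, industrial])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
print("cluster sizes:", np.bincount(model.labels_))
print("cluster centroids (first 6 hours):\n", model.cluster_centers_[:, :6].round(2))
```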
Abstract:
World Congress of Malacology, Ponta Delgada, July 22-28, 2013.
Abstract:
This paper describes the communication stack of the REMPLI system: a structure using power lines and IP-based networks for communication, data acquisition and control of energy distribution and consumption. It is furthermore prepared to use alternative communication media such as GSM or analog modem connections. The REMPLI system provides communication services for existing applications, namely automated meter reading, energy billing and domotic applications. The communication stack, consisting of the physical, network, transport and application layers, is described, as well as the communication services provided by the system. We show how the peculiarities of power-line communication influence the design of the communication stack, introducing requirements to use the limited bandwidth efficiently, optimize traffic and implement fair use of the communication medium across the large number of communication partners.
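To make the fair-use requirement concrete, the sketch below shows one simple policy that satisfies it: a round-robin scheduler sharing a narrow channel among many partners. The class, queue and partner names are hypothetical and do not reflect the actual REMPLI implementation.

```python
# Illustrative fair-use policy: serve queued frames from each partner in
# round-robin order so no single node monopolizes the channel.
from collections import deque

class FairScheduler:
    def __init__(self):
        self.queues = {}      # partner id -> deque of pending frames
        self.order = deque()  # round-robin rotation of partner ids

    def enqueue(self, partner, frame):
        if partner not in self.queues:
            self.queues[partner] = deque()
            self.order.append(partner)
        self.queues[partner].append(frame)

    def next_frame(self):
        """Pick the next frame to transmit, rotating across partners."""
        for _ in range(len(self.order)):
            partner = self.order[0]
            self.order.rotate(-1)          # move this partner to the back
            if self.queues[partner]:
                return partner, self.queues[partner].popleft()
        return None                        # nothing pending anywhere

sched = FairScheduler()
sched.enqueue("meter-17", b"reading:42.3kWh")
sched.enqueue("meter-04", b"reading:7.9kWh")
print(sched.next_frame(), sched.next_frame())
```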
Abstract:
Dissertation presented to obtain the degree of Doctor in Chemical Engineering, speciality Chemical Reaction Engineering, at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
We present the modeling efforts on antenna design and frequency selection to monitor brain temperature during prolonged surgery using noninvasive microwave radiometry. A tapered log-spiral antenna design is chosen for its wideband characteristics that allow higher power collection from deep brain. Parametric analysis with the software HFSS is used to optimize antenna performance for deep brain temperature sensing. Radiometric antenna efficiency (eta) is evaluated in terms of the ratio of power collected from the brain to the total power received by the antenna. Anatomical information extracted from several adult computed tomography scans is used to establish design parameters for constructing an accurate layered 3-D tissue phantom. This head phantom includes separate brain and scalp regions, with tissue-equivalent liquids circulating at independent temperatures on either side of an intact skull. The optimized frequency band is 1.1-1.6 GHz, producing an average antenna efficiency of 50.3% from a two-turn log-spiral antenna. The entire sensor package is contained in a lightweight and low-profile assembly, 2.8 cm in diameter and 1.5 cm high, that can be held in place over the skin with an electromagnetic interference shielding adhesive patch. The calculated radiometric equivalent brain temperature tracks within 0.4 degrees C of the measured brain phantom temperature when the brain phantom is lowered 10 degrees C and then returned to the original temperature (37 degrees C) over a 4.6-h experiment. The numerical and experimental results demonstrate that the optimized 2.5-cm log-spiral antenna is well suited for the noninvasive radiometric sensing of deep brain temperature.
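The efficiency figure quoted above is the fraction of collected power that originates in brain tissue, and the radiometric equivalent temperature is the corresponding power-weighted average over tissue regions; the sketch below illustrates both with assumed, not measured, numbers.

```python
# Toy radiometric bookkeeping: eta is the brain's share of collected power,
# and the equivalent temperature is a power-weighted average over tissues.
def radiometric_temperature(power_fractions, temperatures):
    """Power-weighted equivalent temperature seen by the radiometer."""
    total = sum(power_fractions)
    return sum(w * T for w, T in zip(power_fractions, temperatures)) / total

# Assumed power split between brain, skull and scalp, and their temperatures (deg C).
weights = {"brain": 0.503, "skull": 0.15, "scalp": 0.347}   # eta_brain ~ 50.3%
temps = {"brain": 37.0, "skull": 35.5, "scalp": 34.0}

eta = weights["brain"] / sum(weights.values())
T_eq = radiometric_temperature(list(weights.values()), list(temps.values()))
print(f"antenna efficiency eta = {eta:.1%}, equivalent temperature = {T_eq:.2f} deg C")
```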
Abstract:
To meet the increasing demands of complex inter-organizational processes and the demand for continuous innovation and internationalization, it is evident that new forms of organisation are being adopted, fostering more intensive collaboration processes and sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management comprises fundamental processes to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations, and have been researched in the last decade in the areas of computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area has not given much attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is then clear that access to and representation of knowledge will increasingly happen in multilingual settings, which implies overcoming the difficulties inherent to the presence of multiple languages, through processes such as the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice, and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support to the development of knowledge representations, in particular ontologies, expressed in more than one language. Multilingual knowledge representation is thus an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences. This workshop brought together researchers interested in multilingual knowledge representation, in a multidisciplinary environment, to debate the possibilities of cross-fertilization between these fields when applied to contexts where multilingualism continuously creates new and demanding challenges to current knowledge representation methods and techniques. In this workshop six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in the development of ongoing projects. In the first paper, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining, Termontospider, a wiki crawler that aims to traverse Wikipedia optimally in search of domain-specific texts for extracting terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically underpinned multilingual terminological databases. In this paper the authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently tested. In the second paper, Fumiko Kano presents a work comparing four feature-based similarity measures derived from cognitive sciences.
The purpose of the comparative analysis presented by the author is to verify the potentially most effective model for mapping independent ontologies in a culturally influenced domain. For that, datasets based on standardized pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), have been used for the comparative analysis of the similarity measures. The purpose of the comparison is to verify the similarity measures against objectively developed datasets. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts existing for a targeted socio-cultural community. In another presentation, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann present ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, to provide candidate terms for building ontology elements or instances of ontology concepts. The authors present a complementary approach to the direct localization/translation of ontology labels, acquiring terminologies through the access and harvesting of multilingual Web presences of structured information providers in the field of finance. This leads to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves. In the next paper, Manuel Silva, António Lucas Soares and Rute Costa claim that, despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting. To tackle these issues the authors present a collaborative platform, conceptME, where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support a multilingual ontology specification. In another presentation, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, i.e. the translation office of the ministry of justice. The project aims at developing an advanced tool that includes expert knowledge in the algorithms that extract specialized language from textual data (legal documents), and whose outcome is a knowledge database including Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion. Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at the UCSC Central Library, in which they propose to adapt, for subject librarians employed in large and multilingual academic institutions, the model used by translators working within European Union institutions.
The authors are using User Experience (UX) analysis to provide subject librarians with visual support, by means of “ontology tables” depicting conceptual links and connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience and will be a starting point for further discussion and cooperation.
Transient Spaces: unsettling boundaries and norms at the cultural event Noc Noc, Guimarães, Portugal
Abstract:
Cities are increasingly expected to be creative, inventive and to exhibit intense expressivity. In the past decades many cities have experienced growing pressure to produce and stage cultural events of different sorts and to develop new strategies that optimize competitive advantages, in order to promote themselves and to boost and sell their image. Often these actions have relied on heavy public investment and major private corporate sponsorship, but it is not always clear or measured how successful and productive these investments have been. In the context of strained public finances and the profound economic crisis of European peripheral countries, events that emerge from local communities and have low budgets, yet manage to create significant fluxes of visitors and visibility, assume an increased interest. In order to reflect on this and sketch possible answers, we look to an emerging body of literature concerning creative cities, and we focus on the organisation of a particular cultural event and its impact and assimilation into a medium-sized Portuguese city. This paper looks at the two editions (2011 and 2012) of one such event, Noc Noc, organized by a local association in the city of Guimarães, Portugal. Inspired by similar events, Noc Noc is based on creating transient spaces of culture which are explored by artists and audiences, by transforming numerous homes into ephemeral, convivial and playful social ‘public’ environments. The event is based on a number of cultural venues/homes scattered around the old and newer city, which allows for an informal urban exploration and an autonomous rambling and getting lost along streets. This strategy not only disrupts the cleavages between public and private space, permitting various transgressions, but it also disorders normative urban experiences and unsettles the dominant role of the city council as the culture patron of the large majority of events. Guimarães, a UNESCO World Heritage City, was the European Capital of Culture in 2012, with a public investment of roughly 73 million euro. By interviewing a sample of people who have hosted these transitory art performances and exhibitions, sometimes doubling as artists, and the event’s organizers, and by experiencing both editions of the event, this paper illustrates how urban citizens’ engagement and motivations in a low-budget cultural event can strengthen community ties. Furthermore, it also questions the advantages of large-scale, high-budget events, and asks how this event may be seen as an unconscious counter-movement against the commodification of cultural events and everyday urban experience at large, engaging with the concepts of staging and authenticity.
Abstract:
This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological and ecotoxicological data (with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) obtained from the analysis of water samples collected monthly at seven monitoring sites during five campaigns (February, May, June, August, and September 2006). The results of some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to the classification of Leça River water quality as “bad” or “very bad”. PCA and CA identified monitoring sites with a similar pollution pattern, distinguishing site 1 (located in the upstream stretch of the river) from all other sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful to optimize water quality monitoring networks, defining the minimum number of sites and their location. Thus, these tools can support appropriate management decisions.
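For readers unfamiliar with the workflow, the sketch below reproduces the PCA-plus-cluster-analysis steps on synthetic data: standardize the variables measured at each site, project them with PCA, and group sites with a similar pollution pattern by hierarchical clustering. The variables and values are illustrative assumptions, not the Leça River dataset.

```python
# PCA + hierarchical cluster analysis of per-site water quality data (toy data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
sites = [f"S{i}" for i in range(1, 8)]          # seven monitoring sites
# rows = sites, columns = e.g. BOD, COD, NH4, faecal coliforms, toxicity (assumed)
X = rng.normal(size=(7, 5))
X[1:] += 2.0                                     # downstream sites more polluted than S1

Z = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Z)    # site scores on the first two PCs
groups = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")

for s, pc, g in zip(sites, scores, groups):
    print(f"{s}: PC1={pc[0]:+.2f} PC2={pc[1]:+.2f} cluster={g}")
```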
Abstract:
Work presented within the scope of the Master's programme in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering
Abstract:
Concerns about metals in urban wastewater treatment plants (WWTPs) are mainly related to their content in discharges to the environment, namely in the final effluent and in the sludge produced. In the near future, more restrictive limits will be imposed on final effluents, due to the recent guidelines of the European Water Framework Directive (EUWFD). Concerning the sludge, at least seven metals (Cd, Cr, Cu, Hg, Ni, Pb and Zn) have been regulated in different countries, four of which were classified by the EUWFD as priority substances and two of which were also classified as hazardous substances. Although WWTPs are not designed to remove metals, the study of metal behaviour in these systems is a crucial issue for developing predictive models that can help regulate pre-treatment requirements more effectively and contribute to optimizing the systems to achieve more acceptable metal concentrations in their discharges. Relevant data have been published in the literature in recent decades concerning the occurrence, fate and behaviour of metals in WWTPs. However, the information is dispersed and not standardized in terms of parameters for comparing results. This work provides a critical review of this issue through a careful systematization, in tables and graphs, of the results reported in the literature, which allows their comparison and analysis in order to draw conclusions about the state of the art in this field. A summary of the main points of consensus, divergences and constraints found, as well as some recommendations, is presented in the conclusions, aiming to contribute to a more concerted future research effort. © 2015, Islamic Azad University (IAU).
Abstract:
Mobile devices are embedded systems with very limited capacities that need to be considered when developing a client-server application, mainly due to the technical, ergonomic and economic implications for the mobile user. With the increasing popularity of mobile computing, many developers have faced problems caused by the low performance of devices. In this paper, we discuss how to optimize and create client-server applications for wireless/mobile environments, presenting techniques to improve overall performance.
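As an example of the kind of technique such a discussion covers, the sketch below combines two common client-side optimizations for mobile links: local response caching and compressed transfers. The endpoint and cache policy are illustrative assumptions, not the paper's actual system.

```python
# Client-side optimizations for a slow mobile link: cache recent responses
# locally and request gzip-compressed payloads to cut bandwidth and round trips.
import gzip
import time
import urllib.request

_cache = {}        # url -> (timestamp, body)
CACHE_TTL = 60.0   # seconds a cached response stays valid on the device

def fetch(url):
    """Return the response body, reusing a recent cached copy when possible."""
    now = time.time()
    if url in _cache and now - _cache[url][0] < CACHE_TTL:
        return _cache[url][1]                      # avoid a radio round trip
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read()
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)           # smaller payload over the air
    _cache[url] = (now, body)
    return body

# Example (hypothetical endpoint):
# data = fetch("https://example.com/api/orders?since=2024-01-01")
```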
Abstract:
Master's degree in Informatics Engineering - Specialization Area in Graphics Systems and Multimedia