938 results for "Optimistic data replication system"
Abstract:
A numerical method based on integral equations is proposed and investigated for the Cauchy problem for the Laplace equation in 3-dimensional smooth bounded doubly connected domains. To numerically reconstruct a harmonic function from knowledge of the function and its normal derivative on the outer of two closed boundary surfaces, the harmonic function is represented as a single-layer potential. Matching this representation against the given data, a system of boundary integral equations is obtained to be solved for two unknown densities. This system is rewritten over the unit sphere under the assumption that each of the two boundary surfaces can be mapped smoothly and one-to-one to the unit sphere. For the discretization of this system, Weinert’s method (PhD, Göttingen, 1990) is employed, which generates a Galerkin type procedure for the numerical solution, and the densities in the system of integral equations are expressed in terms of spherical harmonics. Tikhonov regularization is incorporated, and numerical results are included showing the efficiency of the proposed procedure.
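The abstract mentions Tikhonov regularization for stabilizing the discretized integral-equation system. As a minimal sketch (not the authors' Galerkin/spherical-harmonics procedure), the following pure-Python snippet solves the regularized normal equations (AᵀA + λI)x = Aᵀb for a small ill-conditioned system; the matrices and the choice of λ are illustrative assumptions.

```python
# Minimal sketch of Tikhonov regularization for a linear system A x = b,
# as arises after discretizing an integral equation. Pure Python;
# matrices are lists of lists.

def mat_t(A):
    """Transpose of a matrix."""
    return [list(row) for row in zip(*A)]

def mat_mul(A, B):
    """Matrix product A @ B."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def solve(M, v):
    """Gaussian elimination with partial pivoting."""
    n = len(M)
    aug = [row[:] + [v[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(aug[r][c]))
        aug[c], aug[p] = aug[p], aug[c]
        for r in range(c + 1, n):
            f = aug[r][c] / aug[c][c]
            for k in range(c, n + 1):
                aug[r][k] -= f * aug[c][k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i][n] - sum(aug[i][j] * x[j]
                                for j in range(i + 1, n))) / aug[i][i]
    return x

def tikhonov(A, b, lam):
    """Solve the regularized normal equations (A^T A + lam*I) x = A^T b."""
    At = mat_t(A)
    AtA = mat_mul(At, A)
    for i in range(len(AtA)):
        AtA[i][i] += lam
    Atb = [sum(At[i][j] * b[j] for j in range(len(b)))
           for i in range(len(At))]
    return solve(AtA, Atb)
```

Increasing λ trades fidelity to the data for stability, which is the essential mechanism for ill-posed problems such as the Cauchy problem above.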
Abstract:
In his discussion - Database As A Tool For Hospitality Management - William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset, “Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing.” The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; it is a shade technical. “In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer,” O’Brien informs. “When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable,” he further offers with his informed outlook. Professor O’Brien presents a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned.
Regarding the loss of the personal touch when a customer is engaged with a computer system, O’Brien says, “A modern data processing system should not force an employee to treat valued customers as numbers…” He also cautions, “Any computer system that decreases the availability of the personal touch is simply unacceptable.” On a system’s ability to process information, O’Brien suggests that in the past businesses were so enamored with simply having an automated system that they failed to take full advantage of its capabilities; considerable savings in time and money went unnoticed and/or under-appreciated. Today, everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O’Brien invokes the 80/20 rule, offering, “…the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential.” The evolution of data systems takes center stage for much of the article; pitfalls also emerge.
Abstract:
Petri Nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs result in compact system models that are easier to understand; HLPNs are therefore more useful in modeling complex systems.

There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool.

For modeling, this framework integrates two formal languages: a type of HLPN called Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool supporting the formal modeling capabilities of this framework.

For analysis, this framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is straightforward and fast but covers only some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool supporting the formal analysis capabilities of this framework.

The SAMTools developed for this framework integrate three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
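To make the token-game semantics underlying the abstract concrete, here is a minimal low-level Petri net simulator in Python: a transition is enabled when every input place holds at least one token, and firing moves one token from each input place to each output place. This is a generic illustration, not the PrT Net formalism or the PIPE+ tool.

```python
# Minimal sketch of a low-level Petri net and one simulation run.
# Place names and the example net are illustrative only.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:                       # consume one token per input
            self.marking[p] -= 1
        for p in outputs:                      # produce one token per output
            self.marking[p] = self.marking.get(p, 0) + 1

# A single work step: one execution path, as a simulator would explore it.
net = PetriNet({"ready": 1, "done": 0})
net.add_transition("work", ["ready"], ["done"])
```

Simulation in this sense follows one firing sequence at a time, which is why, as the abstract notes, it covers only some execution paths.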
Abstract:
The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for optimum operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and secure real-time control and operation of hybrid power systems.

To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system equipped with a state estimation scheme driven by the real-time data. The system is verified on a specially developed laboratory-based test-bed facility, a hardware and software platform that emulates the scenarios of a real hybrid power system with the highest practicable similarity to utility systems. It includes phasor measurements at hundreds of measurement points on the system. These measurements were obtained from a specially developed laboratory-based Phasor Measurement Unit (PMU) used alongside existing commercial PMUs on the interconnected system. The studies tested included a new technique for detecting partially islanded microgrids, in addition to several real-time techniques for synchronization and parameter identification of hybrid systems.

Moreover, given the widespread integration of renewable energy resources through DC microgrids, this dissertation examines several practical cases for improving the interoperability of such systems. The increasing number of small, dispersed generating stations, and their need to connect quickly and properly to AC grids, also led this work to explore the challenges that arise in synchronizing generators to the grid, and to introduce a Dynamic Brake system to improve the process of connecting distributed generators to the power grid.

Real-time operation and control require data communication security. A research effort in this dissertation developed a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security of the power grid as a cyber-physical system; it is based on available GPS synchronization technology and provides protection against confidentiality attacks in critical power system infrastructures.
Abstract:
The Brazilian Environmental Data Collecting System (SBCDA) collects and broadcasts meteorological and environmental data, which are handled by dozens of institutions and organizations. The system's space segment, composed of the data collecting satellites, plays an important role in the system's operation. To ensure the continuity and quality of these services, efforts are being made toward the development of new satellite architectures. Aiming at a reduction in size and power consumption, the design of an integrated circuit containing a receiver front-end is proposed, to be embedded in the next SBCDA satellite generations. The circuit will also operate under the requirements of the international data collecting standard ARGOS. This work focuses on the design of a UHF low-noise amplifier and mixers in a standard CMOS technology. The specifications are first described and the circuit topologies presented. Then the circuit conception is discussed and the design variables derived. Finally, the layout is designed and the final results are discussed. The chip will be fabricated in a 130 nm technology from STMicroelectronics.
Abstract:
The growing need for food is a worldwide concern: population grows geometrically while resources grow arithmetically. Proposals to alleviate this problem include increasing food production and reducing food waste. Many studies worldwide aim to reduce food waste, which can reach 40% of production depending on the region. Techniques to retard food degradation, including drying, serve this purpose. This paper presents the design of a hybrid fruit dryer that uses solar and electric energy with process automation. For the drying tests, typical fruits with good acceptability as processed fruits were chosen. During the experiments, temperature values were measured at several points, along with humidity, solar radiation, and mass. A data acquisition system was built using an Arduino to obtain the temperatures. The data were sent to a program named Secador de Frutas, developed in this work, to plot them. The drying chamber had a volume of 423 liters and, despite its unusual size, tests using mirrors to increase the incidence of direct radiation showed that the dryer is competitive with other solar dryers produced in the Hydraulic Machines and Solar Energy Laboratory (LMHES) at UFRN. The dryer was built at a cost 3 to 5 times lower than industrial dryers operating with the same load of fruit, and its energy cost for producing dried fruit was more favorable than that of dryers using LPG as an energy source. However, the drying time was longer.
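The abstract describes acquiring temperatures with an Arduino. A typical step in such a system is converting a 10-bit ADC reading of an NTC thermistor voltage divider into degrees Celsius via the beta-parameter model; the sketch below illustrates this. The abstract does not specify the sensor, so the 10 kΩ NTC, β = 3950 K, and 10 kΩ series resistor are assumed values.

```python
import math

# Hypothetical conversion from a 10-bit Arduino ADC reading of an NTC
# thermistor voltage divider to temperature in Celsius, using the
# beta-parameter model. R0, BETA and R_SERIES are assumed values, not
# taken from the dryer described in the abstract.

R0 = 10_000.0        # thermistor resistance at 25 C (ohms)
BETA = 3950.0        # beta coefficient (K)
R_SERIES = 10_000.0  # fixed divider resistor to Vcc (ohms)
T0 = 298.15          # 25 C in kelvin

def adc_to_celsius(adc, adc_max=1023):
    # Divider with thermistor on the low side: Vout/Vcc = R_th / (R_th + R_SERIES)
    ratio = adc / adc_max
    r_th = R_SERIES * ratio / (1.0 - ratio)        # solve divider for R_th
    inv_t = 1.0 / T0 + math.log(r_th / R0) / BETA  # beta-model equation
    return 1.0 / inv_t - 273.15
```

With these component values, a mid-scale reading corresponds to 25 °C, and higher readings (higher NTC resistance) correspond to lower temperatures.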
Abstract:
Acknowledgments We thank Edoardo Del Pezzo, Ludovic Margerin, Haruo Sato, Mare Yamamoto, Tatsuhiko Saito, Malcolm Hole, and Seth Moran for their valuable suggestions regarding the methodology and interpretation. Greg Waite provided the P wave velocity model of MSH. An important revision of the methods was made after two blind reviews performed before submission. The suggestions of two anonymous reviewers greatly enhanced our ability to image structures, interpret our results, and test their reliability. The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for access to the waveforms and metadata required in this study, provided by the Cascades Volcano Observatory – USGS. Interaction with geologists and geographers in the Landscape Dynamics Theme of the Scottish Alliance for Geoscience, Environment and Society (SAGES) was important for the interpretation of the results.
Abstract:
The deep-sea sedimentary record is an archive of the pre-glacial to glacial development of Antarctica and of changes in climate, tectonics and ocean circulation. Identification of the pre-glacial, transitional and full glacial components in the sedimentary record is necessary for ice sheet reconstruction and for building circum-Antarctic sediment thickness grids for past topography and bathymetry reconstructions, which constrain paleoclimate models. A ~3300 km long Weddell Sea to Scotia Sea transect, consisting of multichannel seismic reflection data from various organisations, was used to interpret new horizons, define an initial basin-wide seismostratigraphy and identify the pre-glacial to glacial components. We mapped seven main units, of which three lie in the inferred Cretaceous-Paleocene pre-glacial regime, one in the Eocene-Oligocene transitional regime and three in the Miocene-Pleistocene full glacial climate regime. Sparse borehole data from ODP Leg 113 and SHALDRIL constrain the ages of the upper three units. Compiled seafloor-spreading magnetic anomalies constrain the basement ages and the hypothetical age model. In many cases, the new horizons and stratigraphy contradict the interpretations in local studies. Each seismic sedimentary unit and its associated base horizon are continuous and traceable for the entire transect length, but reflect a lateral change in age whilst representing the same depositional process. The up to 1240 m thick pre-glacial seismic units form a mound in the central Weddell Sea basin and, in conjunction with the eroded flank geometry, support the interpretation of a Cretaceous proto-Weddell Gyre. The base reflector of the transitional seismic unit, which marks the initial ice sheet advances to the outer shelf, has a lateral model age of 26.6-15.5 Ma from southeast to northwest. The Pliocene-Pleistocene glacial deposits reveal lower sedimentation rates, indicating a reduced sediment supply.
Sedimentation rates for the pre-glacial, transitional and full glacial components are highest around the Antarctic Peninsula, indicating higher erosion and sediment supply of a younger basement. We interpret an Eocene East Antarctic Ice Sheet expansion, Oligocene grounding of the West Antarctic Ice Sheet and Early Miocene grounding of the Antarctic Peninsula Ice Sheet.
Abstract:
Methane hydrate is an ice-like substance that is stable at high-pressure and low temperature in continental margin sediments. Since the discovery of a large number of gas flares at the landward termination of the gas hydrate stability zone off Svalbard, there has been concern that warming bottom waters have started to dissociate large amounts of gas hydrate and that the resulting methane release may possibly accelerate global warming. Here, we can corroborate that hydrates play a role in the observed seepage of gas, but we present evidence that seepage off Svalbard has been ongoing for at least three thousand years and that seasonal fluctuations of 1-2°C in the bottom-water temperature cause periodic gas hydrate formation and dissociation, which focus seepage at the observed sites.
Abstract:
The PolySMART demonstration system SP1b has been modeled in TRNSYS and calibrated against monitored data. The system is an example of distributed cooling with centralized CHP, where the driving heat is delivered via the district heating network. The system pre-cools the cooling water for the head office of Borlänge municipality, whose main cooling is supplied by a 200 kW compression chiller; the SP1b system thus provides pre-cooling. It consists of a ClimateWell TDC with a nominal capacity of 10 kW, together with a dry cooler for recooling and heat exchangers in the cooling and driving circuits. The cooling system is only operated from 06:00 to 17:00 on working days, and the cooling season generally runs from mid-May to mid-September. The nominal operating conditions of the main chiller are 12/15°C. The main aims of this simulation study were to reduce the electricity consumption, if possible improving the thermal COP and capacity at the same time, and to study how the system would perform under different boundary conditions such as climate and load. The calibration of the system model was made in three stages: estimation of parameters based on manufacturer data and the dimensions of the system; calibration of each circuit (pipes and heat exchangers) separately using steady-state points; and finally calibration of the complete model in terms of thermal and electrical energy as well as running times, against a five-day time series of one-minute average data. All the performance figures were within 3% of the measured values, apart from the running time for the driving circuit, which differed by 4%. However, the performance figures for this base-case system for the complete cooling season of mid-May to mid-September were significantly better than those from the monitoring data. This was attributed to long periods when the monitored system was not in operation and to a control parameter that hindered cold delivery at certain times.
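The study's key performance figures for a thermally driven chiller are the thermal COP (cold delivered per unit of driving heat) and an electrical COP relating cold delivered to parasitic electricity for pumps and the dry cooler fan. A hedged sketch of these two ratios follows; the seasonal energy values are illustrative placeholders, not SP1b monitoring data.

```python
# Performance figures for a thermally driven chiller. The season
# dictionary holds illustrative kWh values, not measured SP1b data.

def thermal_cop(q_cold_kwh, q_drive_kwh):
    """Cold delivered per unit of driving heat."""
    return q_cold_kwh / q_drive_kwh

def electrical_cop(q_cold_kwh, w_el_kwh):
    """Cold delivered per unit of parasitic electricity (pumps, fans)."""
    return q_cold_kwh / w_el_kwh

season = {"q_cold": 4200.0, "q_drive": 6000.0, "w_el": 600.0}  # kWh, illustrative
print(thermal_cop(season["q_cold"], season["q_drive"]))   # 0.7
print(electrical_cop(season["q_cold"], season["w_el"]))   # 7.0
```

Reducing electricity consumption, the study's first aim, raises the electrical COP directly; the thermal COP depends instead on the sorption cycle and its operating temperatures.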
Abstract:
Light is the main information about the interstellar medium accessible on Earth. From this information one can draw conclusions about the composition of the region where the light originates, as well as about its history. The requirement for this is that the different absorption and emission features in the spectrum can be identified and assigned to particular molecules, atoms or ions. To enable the identification of the different species, precise spectroscopic investigations of the species in the laboratory are necessary. In this work a new spectroscopic method is presented, which can be used to record pure rotational spectra of mass-selected, cold, stored molecular ions. It is based on the idea of state-specific attachment of helium atoms to the stored molecular ions. The new technique has been made possible through the development and recent completion of two new 22-pole ion trap instruments in the Laboratory Astrophysics group at the University of Cologne. These new instruments have the advantage of reaching temperatures as low as 4 K, compared to the 10 K of the predecessor instrument. These low temperatures enable the ternary attachment of helium atoms to the stored molecular ions and thereby make it possible to develop this new method for pure rotational spectroscopy. Accordingly, this work is divided into two parts. The first part deals with the new FELion experiment, which was built and characterized in the first part of the thesis. FELion is a cryogenic 22-pole ion trap apparatus that can generate, mass-select, store, cool down, and analyze molecular ions. The different components of the instrument, e.g. the Storage Ion Source for generating the ions and the first quadrupole mass filter, are described and characterized in this part. In addition, the newly developed control and data acquisition system is introduced. With this instrument the measurements presented in the second part of the work were performed.

The second part deals with the new action-spectroscopic method of state-selective helium attachment to the stored molecular ions. For a deeper analysis of the new technique, the systems CD+ plus helium and HCO+ plus helium are investigated in detail. Analytical and numerical models of the process are presented and compared to experimental results. The results of these investigations point to a seemingly very general applicability of the new method to a wide class of molecular ions. In the final part of the thesis, measurements of the rotational spectrum of l-C3H+ are presented. These measurements deserve highlighting, since for the first time it was possible in the laboratory to unambiguously measure four low-lying rotational transitions of l-C3H+. These measurements (Brünken et al. ApJL 783, L4 (2014)) enabled the reliable identification of previously unidentified emission lines observed in several regions of the interstellar medium (Pety et al. Astron. Astrophys. 548, A68 (2012); McGuire et al. The Astrophysical Journal 774, 56 (2013); McGuire et al. The Astrophysical Journal 783, 36 (2014)).
Abstract:
Industry today operates in an increasingly competitive market that demands a business strategy capable of guaranteeing a company's permanence and prominence. For this reason, adequate production planning and control is essential to the smooth running of a company. Through such systems it is possible to act positively on production, making the company's productive sector more profitable, which contributes both to improved quality of service and to the company's economic growth. With adequate production planning, an organization with the same capacities can produce the same quantities in less time. On the other hand, precise production control is indispensable for supplying correct information when it is needed. With optimization in mind, a company put forward several suggestions for improvement at the level of production planning and control, and this work arises to respond to those proposals. To that end, in the course of this dissertation, a tool was created comprising two algorithms and a control system for automated information acquisition. In short, the proposed system is able to build good planning solutions, combined with a very practical and effective data acquisition system, while always maintaining the flexibility required of a system of this kind.
Abstract:
This work proposes a study of brain signals applied to BCI (Brain-Computer Interface) systems, using Decision Trees and analyzing those trees in the light of neuroscience. Processing the data requires five phases: data acquisition, preprocessing, feature extraction, classification, and validation. All phases are covered in this work, but the classification and validation phases are emphasized. Classification uses the Artificial Intelligence technique known as Decision Trees, recognized in the literature as one of the simplest and most successful families of learning algorithms. Validation, in turn, is carried out through studies grounded in neuroscience, the set of disciplines that study the nervous system: its structure, development, functioning, evolution, relation to behavior and the mind, and also its disorders. The results obtained in this work are promising, even though preliminary, since they can better explain, in an automated way, some brain processes.
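As a minimal illustration of the classification phase described above, the following sketch trains a decision stump (a depth-1 decision tree) by exhaustively searching for the feature/threshold split that minimizes misclassifications. The feature vectors stand in for EEG-derived features, but the values are toy data, not results from this work.

```python
# Minimal decision-stump learner (a depth-1 decision tree). Features
# and labels are toy values, not real EEG data.

def train_stump(X, y):
    """Pick the (feature, threshold, left_label) split with fewest errors."""
    best = None
    for f in range(len(X[0])):
        values = sorted(set(row[f] for row in X))
        for i in range(len(values) - 1):
            thr = (values[i] + values[i + 1]) / 2.0  # midpoint candidate
            for left_label in (0, 1):
                right_label = 1 - left_label
                errors = sum(
                    (left_label if row[f] <= thr else right_label) != label
                    for row, label in zip(X, y)
                )
                if best is None or errors < best[0]:
                    best = (errors, f, thr, left_label)
    _, f, thr, left_label = best
    return lambda row: left_label if row[f] <= thr else 1 - left_label

# Toy "band-power" features: class 1 has a larger second feature.
X = [[0.2, 0.1], [0.4, 0.2], [0.3, 0.9], [0.1, 0.8]]
y = [0, 0, 1, 1]
classify = train_stump(X, y)
```

A full decision tree applies this split search recursively to each child node; the stump keeps the example short while showing the core operation, and its learned threshold is directly readable, which is what makes tree models attractive for neuroscience-based validation.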