943 results for automated full waveform logging system
Abstract:
Hybrid logic is a valuable tool for specifying relational structures: while it allows accessibility relations between states to be defined, it also provides a way to name specific states and refer to what happens at each of them. However, given the many sources of information available nowadays, we may need to deal with contradictory information. This motivated Quasi-hybrid logic, a paraconsistent version of hybrid logic capable of dealing with inconsistencies in information written as hybrid formulas. In [5] we have already developed a semantics for this paraconsistent logic. In this paper we go a step further and study its proof-theoretical aspects. We present a complete tableau system for Quasi-hybrid logic, obtained by combining tableaux for Quasi-classical and Hybrid logics.
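The flavour of a labelled tableau for hybrid logic can be illustrated with a toy expansion step for satisfaction statements. This is only a generic sketch of the conjunction rule, not the paper's combined calculus; the tuple encoding is an invented convention.

```python
# Toy illustration (not the paper's calculus): expanding a hybrid
# satisfaction statement @i(phi & psi) into @i phi and @i psi,
# i.e. the conjunction rule of a labelled tableau.

def expand(formula):
    """Expand one satisfaction statement ('@', nominal, phi) a single step."""
    tag, nominal, phi = formula
    assert tag == "@"
    if isinstance(phi, tuple) and phi[0] == "and":
        # @i(phi & psi)  =>  @i phi , @i psi  (both go on the same branch)
        return [("@", nominal, phi[1]), ("@", nominal, phi[2])]
    return [formula]  # literal: nothing left to expand

branch = [("@", "i", ("and", "p", "q"))]
expanded = [g for f in branch for g in expand(f)]
# expanded now holds @i p and @i q on the branch
```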
Abstract:
We examine the evolution of a bistable reaction in a one-dimensional stretching flow, as a model for chaotic advection. We derive two reduced systems of ordinary differential equations (ODEs) for the dynamics of the governing advection-reaction-diffusion partial differential equations (PDE), for pulse-like and for plateau-like solutions, based on a non-perturbative approach. This reduction allows us to study the dynamics in two cases: first, close to a saddle-node bifurcation at which a pair of nontrivial steady states are born as the dimensionless reaction rate (Damkoehler number) is increased, and, second, for large Damkoehler number, far away from the bifurcation. The main aim is to investigate the initial-value problem and to determine when an initial condition subject to chaotic stirring will decay to zero and when it will give rise to a nonzero final state. Comparisons with full PDE simulations show that the reduced pulse model accurately predicts the threshold amplitude for a pulse initial condition to give rise to a nontrivial final steady state, and that the reduced plateau model gives an accurate picture of the dynamics of the system at large Damkoehler number. Published in Physica D (2006)
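The threshold behaviour described above can be sketched with a generic reduced model: a scalar ODE with a bistable cubic reaction term and a linear decay standing in for stretching. The specific equation and the parameter values (Da, alpha, lam) are illustrative assumptions, not the paper's reduced systems; the point is only that small initial amplitudes decay while large ones reach a nontrivial steady state.

```python
# Hedged sketch of a pulse-amplitude threshold in a reduced bistable
# model: d(theta)/dt = Da*theta*(1-theta)*(theta-alpha) - lam*theta.
# Parameters are invented for illustration, not taken from the paper.

def final_state(theta0, Da=20.0, alpha=0.3, lam=1.0, dt=1e-3, steps=20000):
    """Forward-Euler integrate the reduced ODE from amplitude theta0."""
    theta = theta0
    for _ in range(steps):
        theta += dt * (Da * theta * (1 - theta) * (theta - alpha) - lam * theta)
    return theta

# Sub-threshold amplitudes decay to zero; super-threshold amplitudes
# are attracted to the nontrivial steady state born at the saddle-node.
low = final_state(0.1)
high = final_state(0.9)
```

For these parameters the nontrivial steady states solve Da(1-θ)(θ-α) = λ, which has real roots only when Da(1-α)² > 4λ, mirroring the saddle-node bifurcation in the Damkoehler number.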
Abstract:
Traditional air delivery to high-bay buildings involves ceiling-level supply and return ducts that create an almost uniform temperature in the space. Problems with this system include potential recirculation of supply air and higher-than-necessary return air temperatures. A new air delivery strategy was investigated that involves changing the height of conventional supply and return ducts to gain control over thermal stratification in the space. A full-scale experiment using ten vertical temperature profiles was conducted in a manufacturing facility over one year. The experimental data were used to validate CFD and EnergyPlus models. CFD simulation results show that supplying air directly to the occupied zone increases stratification while holding thermal comfort constant during cooling operation. The building energy simulation identified how return air temperature offset, set point offset, and stratification influence the building's energy consumption. A utility bill analysis for cooling shows 28.8% HVAC energy savings, while the building energy simulation shows 19.3–37.4% HVAC energy savings.
Abstract:
The role of T-cells within the immune system is to confirm and assess anomalous situations and then either respond to or tolerate the source of the effect. To illustrate how these mechanisms can be harnessed to solve real-world problems, we present the blueprint of a T-cell inspired algorithm for computer security worm detection. We show how the three central T-cell processes, namely T-cell maturation, differentiation and proliferation, naturally map into this domain and further illustrate how such an algorithm fits into a complete immune inspired computer security system and framework.
Abstract:
Fault tolerance allows a system to remain operational to some degree when some of its components fail. One of the most common fault tolerance mechanisms consists of logging the system state periodically and recovering the system to a consistent state in the event of a failure. This paper describes a general logging-based fault tolerance mechanism, which can be layered over deterministic systems. Our proposal describes how a logging mechanism can recover the underlying system to a consistent state even if an action or set of actions was interrupted mid-way due to a server crash. We also propose different methods of storing the logging information, and describe how to deploy a fault-tolerant master-slave cluster for information replication. We adapt our model to a previously proposed framework, which provided common relational features, such as transactions with atomic, consistent, isolated and durable properties, to NoSQL database management systems.
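The log-and-replay idea underlying such mechanisms can be sketched in a few lines, assuming a deterministic key-value store. The class and method names are invented for illustration and are not the paper's API; the essential property is that replaying the append-only log after a crash rebuilds a consistent state, even if the last action was interrupted between logging and applying.

```python
# Minimal sketch of logging-based fault tolerance over a deterministic
# key-value store (illustrative names, not the paper's API).

class LoggedStore:
    def __init__(self):
        self.log = []    # append-only record of actions
        self.state = {}  # volatile state, lost on a crash

    def set(self, key, value):
        self.log.append(("set", key, value))  # 1. persist the action first
        self.state[key] = value               # 2. then apply it

    @staticmethod
    def recover(log):
        """Rebuild a consistent state by replaying the surviving log."""
        store = LoggedStore()
        for op, key, value in log:
            if op == "set":
                store.state[key] = value      # deterministic replay
        store.log = list(log)
        return store

primary = LoggedStore()
primary.set("a", 1)
primary.set("a", 2)
# simulate a server crash in which only the log survives:
replica = LoggedStore.recover(primary.log)
```

Because the operations are deterministic and logged before being applied, replay is safe even when a crash interrupts an action mid-way: the action is either absent from the log (and simply lost) or fully reapplied.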
Abstract:
Background: Understanding transcriptional regulation by genome-wide microarray studies can contribute to unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. The existing software systems for microarray data analysis implement the mentioned standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays, and for the most common synthesized oligo arrays such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach for automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web-services is advantageous in a distributed client-server environment as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches which have much higher computational requirements than microarrays.
Abstract:
Context. With about 2000 extrasolar planets confirmed, the results show that planetary systems have a whole range of unexpected properties. This wide diversity provides fundamental clues to the processes of planet formation and evolution. Aims: We present a full investigation of the HD 219828 system, a bright metal-rich star for which a hot Neptune has previously been detected. Methods: We used a set of HARPS, SOPHIE, and ELODIE radial velocities to search for orbiting companions to HD 219828. The spectra were used to characterise the star and its chemical abundances, as well as to check for spurious, activity-induced signals. A dynamical analysis is also performed to study the stability of the system and to constrain the orbital parameters and planet masses. Results: We announce the discovery of a long-period (P = 13.1 yr), massive (m sin i = 15.1 MJup) companion (HD 219828 c) in a very eccentric orbit (e = 0.81). The same data confirm the existence of a hot Neptune, HD 219828 b, with a minimum mass of 21 M⊕ and a period of 3.83 days. The dynamical analysis shows that the system is stable and that the equilibrium eccentricity of planet b is close to zero. Conclusions: The HD 219828 system is extreme and unique in several aspects. First, among all known exoplanet systems it presents an unusually high mass ratio. We also show that systems like HD 219828, with a hot Neptune and a long-period massive companion, are more frequent than similar systems with a hot Jupiter instead. This suggests that the formation of hot Neptunes follows a different path from that of their hot Jovian counterparts. The high mass, long period, and eccentricity of HD 219828 c also make it a good target for Gaia astrometry as well as a potential target for atmospheric characterisation, using direct imaging or high-resolution spectroscopy. Astrometric observations will allow us to derive its real mass and orbital configuration.
If a transit of HD 219828 b is detected, we will be able to fully characterise the system, including the relative orbital inclinations. With a clearly known mass, HD 219828 c may become a benchmark object for the range in between giant planets and brown dwarfs.
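Characterising such a highly eccentric orbit (e = 0.81) involves repeatedly solving Kepler's equation E - e sin E = M for the eccentric anomaly. The snippet below is a generic Newton-iteration sketch of that standard step, not the paper's dynamical analysis; the starting-guess heuristic for high eccentricity is a common convention.

```python
# Generic Kepler-equation solver: find E such that E - e*sin(E) = M.
import math

def eccentric_anomaly(M, e, tol=1e-12):
    """Newton iteration; a starting guess of pi is robust at high e."""
    E = math.pi if e > 0.8 else M
    for _ in range(100):
        dE = (E - e * math.sin(E) - M) / (1 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

E = eccentric_anomaly(M=1.0, e=0.81)       # eccentricity of HD 219828 c
residual = E - 0.81 * math.sin(E) - 1.0    # should be ~0 if converged
```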
Abstract:
A smart solar photovoltaic grid system is an innovation bringing information and communications technology (ICT) together with power systems control engineering via the internet [1]. This thesis designs and demonstrates a smart solar photovoltaic grid system that is self-healing and environmentally and consumer friendly, with the ability to accommodate other renewable sources of energy generation seamlessly, creating a healthy competitive energy industry and optimising the efficiency of energy assets. This thesis also presents the modelling of an efficient dynamic smart solar photovoltaic power grid system, exploring maximum power point tracking efficiency and optimising the smart solar photovoltaic array through modelling and simulation to improve the quality of design of the solar photovoltaic module. Although quite promising results have been published in the literature over the past decade, most have not addressed the research questions posed in this thesis. The Levenberg-Marquardt and sparse-based algorithms have proven to be very effective tools for improving the quality of design of solar photovoltaic modules, minimising the possible relative errors in this thesis. Guided by theoretical and analytical reviews of the literature, this research chose the MATLAB/Simulink software toolbox for the modelling and simulation experiments performed on the static smart solar grid system. The auto-correlation coefficient results obtained from the modelling experiments give an accuracy of 99% with negligible mean square error (MSE), root mean square error (RMSE) and standard deviation. This thesis further explores the design and implementation of a robust real-time online solar photovoltaic monitoring system, establishing a comparative study of two solar photovoltaic tracking systems which provide remote access to the harvested energy data.
This research made a landmark innovation in designing and implementing a unique approach to online remote-access solar photovoltaic monitoring systems, providing up-to-date information on the energy produced by the solar photovoltaic module at the site location. In addressing the challenge of online solar photovoltaic monitoring, a Darfon online data logger device was systematically integrated into the design for a comparative study of the two solar photovoltaic tracking systems examined in this thesis. The site for the comparative study of the solar photovoltaic tracking systems is the National Kaohsiung University of Applied Sciences, Taiwan, R.O.C. The overall comparative energy output efficiency of the azimuthal-altitude dual-axis system over the 45° stationary solar photovoltaic monitoring system, as observed at the research site, is about 72%, based on the total energy produced, the estimated money saved and the amount of CO2 reduction achieved. Similarly, in comparing the total amount of energy produced by the two solar photovoltaic tracking systems, the overall daily generated energy for the month of July shows the effectiveness of the azimuthal-altitude tracking system over the 45° stationary solar photovoltaic system. It was found that the azimuthal-altitude dual-axis tracking system was about 68.43% efficient compared to the 45° stationary solar photovoltaic system. Lastly, the overall comparative hourly energy efficiency of the azimuthal-altitude dual-axis system over the 45° stationary solar photovoltaic energy system was found to be 74.2%. Results from this research are quite promising and significant in satisfying the research objectives and questions posed in the thesis. The new algorithms introduced in this research, and the statistical measures applied to the modelling and simulation of a smart static solar photovoltaic grid system, outperformed previous works in the reviewed literature.
Based on this new implementation design of the online data logging system for solar photovoltaic monitoring, it is possible for the first time to have online, on-site information about the energy produced available remotely, with fault identification and rectification, maintenance and recovery deployed as fast as possible. The results presented in this research on the Internet of Things (IoT) in smart solar grid systems are likely to offer real-life value both to the existing body of knowledge and to the future solar photovoltaic energy industry, irrespective of the study site location for the comparative solar photovoltaic tracking systems. While the thesis has contributed to the smart solar photovoltaic grid system, it has also highlighted areas of further research and the need to investigate further how to improve the choice and quality of design of solar photovoltaic modules. Finally, it has also made recommendations for further research into the minimisation of the absolute or relative errors in the quality and design of the smart static solar photovoltaic module.
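Maximum power point tracking, which the thesis explores, is often implemented with a perturb-and-observe loop. The sketch below is a generic illustration of that common scheme, not the thesis's algorithm; the toy power curve and its maximum at 30 V are invented for the example.

```python
# Hedged sketch of perturb-and-observe MPPT on a toy PV power curve.
# The curve and all parameters are illustrative assumptions.

def pv_power(v):
    """Toy PV power curve with a single maximum (here at v = 30)."""
    return max(0.0, v * (60 - v)) / 10.0

def perturb_and_observe(v=10.0, step=0.5, iters=200):
    p = pv_power(v)
    direction = 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:                 # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v

v_mpp = perturb_and_observe()
# the operating voltage climbs toward, then oscillates tightly around,
# the true maximum power point
```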
Abstract:
The work presented herein focused on the automation of coordination-driven self-assembly, exploring methods that allow syntheses to be followed more closely while forming new ligands, as part of the fundamental study of the digitization of chemical synthesis and discovery. Whilst the control and understanding of the principles of pre-organization and self-sorting under non-equilibrium conditions remain a key goal, a clear gap has been identified: the absence of approaches that permit fast screening and real-time observation of the reaction process under different conditions. A firm emphasis was thus placed on the realization of an autonomous chemical robot, which can not only monitor and manipulate coordination chemistry in real time, but can also explore a large chemical parameter space defined by the ligand building blocks and the metal to coordinate. The self-assembly of imine ligands with copper and nickel cations has been studied in a multi-step approach using a self-built flow system capable of automatically controlling the liquid handling and collecting data in real time using a benchtop MS and an NMR spectrometer. This study led to the identification of a transient Cu(I) species in situ which allows for the formation of dimeric and trimeric carbonato-bridged Cu(II) assemblies. Furthermore, new Ni(II) complexes and, more remarkably, a new binuclear Cu(I) complex, which usually requires long and laborious inert conditions, could be isolated. The study was then expanded to the autonomous optimization of the ligand synthesis by enabling feedback control of the chemical system via benchtop NMR. The synthesis of new polydentate ligands emerged from the study, aiming to enhance the complexity of the chemical system to accelerate the discovery of new complexes. This type of ligand consists of 1-pyridinyl-4-imino-1,2,3-triazole units, which can coordinate with different metal salts.
The studies testing the CuAAC synthesis via microwave led to the discovery of four new Cu complexes, one of them a coordination polymer obtained from a solvent-dependent crystallization technique. With the goal of easier integration into an automated system, copper tubing was exploited as the chemical reactor for the synthesis of this ligand, as it efficiently enhances the rate of triazole formation and consequently promotes the formation of the full ligand in high yields within two hours. Lastly, the digitization of coordination-driven self-assembly has been realized for the first time using an in-house autonomous chemical robot, herein named the ‘Finder’. The chemical parameter space to explore was defined by the selection of six variables: the ligand precursors necessary to form complex ligands (aldehydes, alkyne-amines and azides), the metal salt solutions, and other reaction parameters – duration, temperature and reagent volumes. The platform was assembled using round-bottom flasks, flow syringe pumps, copper tubing as an active reactor, and in-line analytics – a pH meter probe, a UV-vis flow cell and a benchtop MS. Control over the system was then obtained with an algorithm capable of autonomously focusing the experiments on the most reactive region of the chemical parameter space (by avoiding areas of low interest). This study led to interesting observations, such as metal exchange phenomena, and also to the autonomous discovery of self-assembled structures in solution and solid state – such as 1-pyridinyl-4-imino-1,2,3-triazole-based Fe complexes and two helicates based on the same ligand coordination motif.
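The closed-loop idea of focusing experiments on the most reactive region can be sketched as a shrinking random search over the parameter space. Everything here is illustrative: the scoring function, the two toy parameters (temperature, duration) and their ranges are invented stand-ins for the platform's six variables and its real measurements.

```python
# Hedged sketch: sample candidates around the best point seen so far,
# narrowing the search window, so experiments concentrate in the most
# "reactive" region and areas of low interest are avoided.
import random

def explore(score, bounds, n_iters=200, shrink=0.95, seed=0):
    rng = random.Random(seed)
    centre = [(lo + hi) / 2 for lo, hi in bounds]
    width = [(hi - lo) / 2 for lo, hi in bounds]
    best, best_score = centre, score(centre)
    for _ in range(n_iters):
        cand = [min(max(c + rng.uniform(-w, w), lo), hi)
                for c, w, (lo, hi) in zip(best, width, bounds)]
        s = score(cand)
        if s > best_score:                    # keep only improvements
            best, best_score = cand, s
        width = [w * shrink for w in width]   # narrow the window over time
    return best

# toy "reactivity" peaked at temperature 60 and duration 2 (invented)
toy = lambda x: -((x[0] - 60) ** 2 + 10 * (x[1] - 2) ** 2)
optimum = explore(toy, bounds=[(20, 100), (0.5, 6)])
```

By construction the best-so-far score never decreases and every sampled candidate stays inside the declared bounds.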
Abstract:
Various environmental management systems, standards and tools are being created to assist companies in becoming more environmentally friendly. However, not all enterprises have adopted environmental policies to the same scale and range. Additionally, there is no existing guide to help them determine their level of environmental responsibility and, subsequently, to provide support that enables them to move towards environmental responsibility excellence. This research proposes the use of a Belief Rule-Based (BRB) approach to assess an enterprise's level of commitment to environmental issues. The Environmental Responsibility BRB assessment system has been developed for this research. Participating companies complete a structured questionnaire, and an automated analysis of their responses (using the Belief Rule-Based approach) determines their environmental responsibility level. This is followed by a recommendation on how to progress to the next level. The recommended best practices will help promote understanding, increase awareness, and make the organization greener. BRB systems consist of two parts: a knowledge base and an inference engine. The knowledge base in this research was constructed after an in-depth literature review and critical analysis of existing environmental performance assessment models, guided primarily by the EU Draft Background Report on "Best Environmental Management Practice in the Telecommunications and ICT Services Sector". The reasoning algorithm of the selected Drools JBoss BRB inference engine is forward chaining, in which inference iteratively searches for pattern matches between the input and the if-then clauses. However, the forward chaining mechanism is not equipped to handle uncertainty. Therefore, a decision was made to deploy evidential reasoning and forward chaining with a hybrid knowledge representation inference scheme to accommodate imprecise, ambiguous and fuzzy types of uncertainty.
It is believed that such a system generates well-balanced, sensible results adapted to Green ICT readiness, helping enterprises focus on making their business operations more sustainable.
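The belief-rule idea can be sketched in miniature: rules fire with a matching weight and distribute belief over assessment grades. The grades, rules, weights and the simple weighted-average aggregation below are all invented for illustration; they stand in for, and are much simpler than, the full evidential reasoning scheme and the rule base described above.

```python
# Toy belief-rule inference (illustrative only, not the ER algorithm).

GRADES = ("basic", "intermediate", "advanced")

# Each rule: (required answer scores, belief distribution over GRADES).
RULES = [
    ({"recycling": 1.0, "energy_audit": 0.0}, {"basic": 1.0}),
    ({"recycling": 1.0, "energy_audit": 1.0}, {"intermediate": 0.7, "advanced": 0.3}),
]

def infer(answers):
    """Weight each rule by how well the questionnaire answers match its
    antecedent, then aggregate the belief distributions (weighted average)."""
    total = {g: 0.0 for g in GRADES}
    weight_sum = 0.0
    for antecedent, belief in RULES:
        w = 1.0
        for attr, required in antecedent.items():
            w *= 1.0 - abs(answers.get(attr, 0.0) - required)
        weight_sum += w
        for g, b in belief.items():
            total[g] += w * b
    return {g: v / weight_sum for g, v in total.items()} if weight_sum else total

result = infer({"recycling": 1.0, "energy_audit": 1.0})
# only the second rule matches fully, so its belief distribution dominates
```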
Abstract:
Part 20: Health and Care Networks
Abstract:
Part 12: Collaboration Platforms
Abstract:
Part 6: Engineering and Implementation of Collaborative Networks
Abstract:
A comprehensive environmental monitoring program was conducted in the Ojo Guareña cave system (Spain), one of the longest cave systems in Europe, to assess the magnitude of the spatiotemporal changes in carbon dioxide gas (CO2) in the cave–soil–atmosphere profile. The key climate-driven processes involved in gas exchange, primarily gas diffusion and cave ventilation due to advective forces, were characterized. The spatial distributions of both processes were described through measurements of CO2 and its carbon isotopic signal (δ¹³C[CO2]) from exterior, soil and cave air samples analyzed by cavity ring-down spectroscopy (CRDS). The trigger mechanisms of air advection (temperature or air density differences or barometric imbalances) were monitored by continuous logging systems. Radon monitoring was also used to characterize the changing airflow that results in a predictable seasonal or daily pattern of CO2 concentrations and their carbon isotopic signal. Large daily oscillations of CO2 levels, ranging from 680 to 1900 ppm day⁻¹ on average, were registered while the exterior air temperature oscillated daily around the cave air temperature. These daily variations in CO2 concentration became unobservable once the outside air temperature was continuously below the cave temperature and a prevailing advective renewal of cave air was established, such that the daily-averaged concentrations of CO2 reached minimum values close to the atmospheric background. The daily pulses of CO2 and other tracer gases such as radon (²²²Rn) were smoothed at the inner cave locations, where the fluctuation of both gases was primarily correlated with medium-term changes in air pressure. A pooled analysis of these data provided evidence that atmospheric air that is inhaled into dynamically ventilated caves can then return to the lower troposphere as CO2-rich cave air.
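The kind of post-processing implied above can be sketched as follows: grouping a logged hourly CO2 series into days, computing each day's oscillation range, and flagging whether the exterior temperature crossed the (roughly constant) cave temperature that day. The data and field names are synthetic and illustrative, not the study's records.

```python
# Sketch: daily CO2 range and exterior/cave temperature crossing,
# computed from a synthetic hourly logger series.

def daily_summary(hours, co2, t_out, t_cave):
    """Group hourly samples by day; return (co2_range, crossed) per day."""
    days = {}
    for h, c, t in zip(hours, co2, t_out):
        days.setdefault(h // 24, []).append((c, t))
    out = []
    for d in sorted(days):
        cs = [c for c, _ in days[d]]
        ts = [t for _, t in days[d]]
        crossed = min(ts) < t_cave < max(ts)  # exterior T oscillated around cave T
        out.append((max(cs) - min(cs), crossed))
    return out

# Two synthetic days: day 0 oscillates around a 12 degC cave temperature
# (large CO2 pulse); day 1 stays colder than the cave (ventilated, flat CO2).
hours = list(range(48))
co2 = [1000 + (800 if h < 24 else 50) * (1 if h % 24 < 12 else -1) for h in hours]
t_out = [(12 + (6 if h % 24 < 12 else -6)) if h < 24 else 4 for h in hours]
summary = daily_summary(hours, co2, t_out, t_cave=12.0)
# summary pairs each day's CO2 range with the crossing flag
```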