877 results for Real applications
Abstract:
Power system engineers face a double challenge: to operate electric power systems within narrow stability and security margins, and to maintain high reliability. There is an acute need to better understand the dynamic nature of power systems in order to be prepared for critical situations as they arise. Innovative measurement tools, such as phasor measurement units, can capture not only the slow variation of the voltages and currents but also the underlying oscillations in a power system. Such access to dynamic data provides both strong motivation and a useful tool for exploring dynamic-data-driven applications in power systems. To fulfill this goal, this dissertation focuses on three areas: developing accurate dynamic load models and updating variable parameters based on measurement data, applying advanced nonlinear filtering concepts and technologies to real-time identification of power system models, and addressing computational issues by implementing the balanced truncation method. By obtaining more realistic system models, together with timely updated parameters and consideration of stochastic influences, we can form an accurate portrait of the ongoing phenomena in an electrical power system and thereby further improve state estimation, stability analysis and real-time operation.
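As an illustration of the model-reduction step mentioned above, the short Python sketch below applies balanced truncation to a random stable state-space model using the python-control package; the model, its size and the retained order are placeholders, not the dissertation's actual system.

import control

# Random stable 20-state model standing in for a linearized power-system model (illustrative only).
full = control.rss(states=20, outputs=2, inputs=2)

# Hankel singular values indicate how many states carry most of the input-output energy.
print("Hankel singular values:", control.hsvd(full))

# Balanced truncation: keep only the dominant states.
reduced = control.balred(full, orders=5)
print("reduced model has", reduced.A.shape[0], "states")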
Abstract:
Resource allocation decisions are made to serve the current emergency without knowing which future emergency will occur. Different ordered combinations of emergencies result in different performance outcomes. Even though future decisions can be anticipated with scenarios, previous models assume that events over a time interval are independent. This dissertation instead assumes that events are interdependent, because speed reduction and rubbernecking caused by an initial incident provoke secondary incidents. The misconception that secondary incidents are uncommon has led to the look-ahead concept being overlooked. This dissertation pioneers relaxing the structural assumption of independence during the assignment of emergency vehicles. When an emergency is detected and a request arrives, an appropriate emergency vehicle is immediately dispatched. We provide tools for quantifying impacts based on the fundamentals of incident occurrence through identification, prediction, and interpretation of secondary incidents. The proposed online dispatching model minimizes the cost of moving the next emergency unit while keeping the response as close to optimal as possible. Using the look-ahead concept, the online model flexibly re-computes the solution, basing future decisions on present requests. We introduce various online dispatching strategies with visualizations of the algorithms and provide insights into their differences in behavior and solution quality. The experimental evidence indicates that the algorithm works well in practice. After serving a designated request, the available and/or remaining vehicles are relocated to a new base in anticipation of the next emergency. System costs will be excessive if the delay caused by dispatching decisions is ignored when relocating response units. This dissertation presents an integrated method that begins with a location phase to manage initial incidents and progresses through a dispatching phase to manage the stochastic occurrence of subsequent incidents. Previous studies used the frequency of independent incidents and ignored scenarios in which two incidents occurred within proximal regions and intervals. The proposed analytical model relaxes the structural assumptions of the Poisson process (independent increments) and incorporates the evolution of primary and secondary incident probabilities over time. The mathematical model overcomes several limiting assumptions of previous models, such as no waiting time, a rule of returning to the original depot, and fixed depots. The temporary locations, made flexible by the look-ahead, are compared with the current practice of locating units at depots based on Poisson theory. A linearization of the formulation is presented, and an efficient heuristic algorithm is implemented to handle a large-scale problem in real time.
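The following Python sketch is purely illustrative of the online, look-ahead dispatching idea described above; the cost function, coordinates and parameters are invented for the example and are not the dissertation's model. The dispatcher picks the available unit that minimizes direct travel time plus a penalty for leaving likely secondary-incident locations poorly covered.

from math import hypot

def travel_time(a, b, speed=1.0):
    # Straight-line travel time between two points (placeholder for a road-network time).
    return hypot(a[0] - b[0], a[1] - b[1]) / speed

def dispatch(units, incident, risk_zones, lookahead_weight=0.5):
    """units: {id: (x, y)} of available units; incident: (x, y);
    risk_zones: [((x, y), probability)] of likely secondary incidents."""
    best_id, best_cost = None, float("inf")
    for uid, pos in units.items():
        direct = travel_time(pos, incident)
        # One-step look-ahead: expected cost of covering risky zones with the remaining units.
        remaining = [p for other, p in units.items() if other != uid]
        future = sum(prob * min((travel_time(r, z) for r in remaining), default=10.0)
                     for z, prob in risk_zones)
        cost = direct + lookahead_weight * future
        if cost < best_cost:
            best_id, best_cost = uid, cost
    return best_id

units = {"A": (0.0, 0.0), "B": (5.0, 5.0)}
print(dispatch(units, incident=(1.0, 1.0), risk_zones=[((6.0, 6.0), 0.3)]))  # -> "A"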
Abstract:
For various reasons, many Algol 68 compilers do not directly implement the parallel processing operations defined in the Revised Algol 68 Report. It is still possible, however, to perform parallel processing, multitasking and simulation provided that the implementation permits the creation of a master routine for the coordination and initiation of processes under its control. The package described here is intended for real-time applications and runs in conjunction with the Algol 68R system; it extends and develops the original Algol 68RT package, which was designed for use with multiplexers at the Royal Radar Establishment, Malvern. The facilities provided, in addition to the synchronising operations, include an interface to an ICL Communications Processor, enabling the abstract processes to be realised as the interaction of several teletypes or visual display units with a real-time program providing a useful service.
Abstract:
The use of distributed embedded systems in areas such as robotics, industrial automation and avionics has become widespread over recent years. Systems of this kind are composed of several nodes, usually referred to as embedded systems. These nodes are interconnected through a communication infrastructure so that they can exchange information and achieve a common goal. Distributed embedded systems typically have very demanding timing requirements. Ethernet technology and the real-time communication protocols developed for it cannot effectively match the timing requirements of real-time applications with the Quality of Service (QoS) requirements of the different traffic types. The Hard Real-Time Ethernet Switching (HaRTES) switch was developed and implemented to address these problems, thanks to capabilities such as the synchronization of different streams and the management of different traffic types. This dissertation presents the adaptation of a physical system so as to demonstrate the correct operation of the communication system, which will be developed and implemented using a HaRTES switch as the element responsible for exchanging information between the nodes on the network. The performance of the resulting network architecture will also be tested and evaluated.
Abstract:
This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are part of the analyses conducted and presented in this dissertation. The first method solves an econometric problem and is concerned with the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed, and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use models on data extracted from the 2009 National Household Travel Survey. The second part of this work offers a comprehensive statistical analysis of the free-flow speed distribution; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles among its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability. A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Ultimately, this work contributes to the literature in transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take into account the survey design. All methods are rigorously validated on both real and simulated data.
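For the multidimensional normal probabilities mentioned above, a minimal illustration (not the dissertation's code) is to evaluate a trivariate normal CDF with SciPy, whose routine is based on Genz's numerical integration approach; the mean, covariance and evaluation point below are arbitrary.

import numpy as np
from scipy.stats import multivariate_normal

mean = np.zeros(3)
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.3],
                [0.2, 0.3, 1.0]])

# P(X1 <= 0.5, X2 <= 1.0, X3 <= 0.0) for a trivariate normal with the covariance above.
p = multivariate_normal(mean=mean, cov=cov).cdf([0.5, 1.0, 0.0])
print(p)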
Abstract:
Dissertation (Master's), Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.
Abstract:
Wireless sensor networks (WSNs) are the key enablers of the internet of things (IoT) paradigm. Traditionally, sensor networks have been designed to be unlike the internet, motivated by power and device constraints. The IETF 6LoWPAN draft standard changes this by defining how IPv6 packets can be efficiently transmitted over IEEE 802.15.4 radio links. Thanks to 6LoWPAN technology, low-power, low-cost microcontrollers can be connected to the internet, forming what is known as the wireless embedded internet. Another IETF recommendation, CoAP, allows these devices to communicate interactively over the internet. The integration of such tiny, ubiquitous electronic devices into the internet enables interesting real-time applications. This thesis evaluates the performance of a stack consisting of CoAP and 6LoWPAN over IEEE 802.15.4 radio links using the Contiki OS and the Cooja simulator, along with the CoAP framework Californium (Cf). Finally, this stack is implemented on real hardware, using a Raspberry Pi as a border router with Tmote Sky sensors acting as SLIP radios and as CoAP servers relaying temperature and humidity data. The reliability of the stack was also demonstrated during a scalability analysis conducted on the physical deployment. Interoperability is ensured by connecting the WSN to the global internet using different hardware platforms supported by Contiki, without the specialized gateways commonly found in non-IP-based networks. This work therefore developed and demonstrated an IP-based, heterogeneous wireless sensor network stack and analyzed its performance both in simulation and on real hardware.
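As a hedged illustration of the CoAP interaction described above: the thesis used the Java framework Californium, but an equivalent GET request to a 6LoWPAN node can be issued from Python with the aiocoap library. The IPv6 address and resource path below are placeholders.

import asyncio
from aiocoap import Context, Message, GET

async def read_temperature():
    # Create a CoAP client context and request a (hypothetical) temperature resource.
    ctx = await Context.create_client_context()
    request = Message(code=GET, uri="coap://[fd00::212:7401:1:101]/sensors/temperature")
    response = await ctx.request(request).response
    print("response code:", response.code, "payload:", response.payload.decode())

asyncio.run(read_temperature())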
Abstract:
Secure computation involves multiple parties computing a common function while keeping their inputs private, and is a growing field of cryptography due to its potential for maintaining privacy guarantees in real-world applications. However, current secure computation protocols are not yet efficient enough to be used in practice. We argue that this is due to much of the research effort being focused on generality rather than specificity. Namely, current research tends to focus on constructing and improving protocols for the strongest notions of security or for an arbitrary number of parties. However, in real-world deployments, these security notions are often too strong, or the number of parties running a protocol would be smaller. In this thesis we make several steps towards bridging the efficiency gap of secure computation by focusing on constructing efficient protocols for specific real-world settings and security models. In particular, we make the following four contributions:
- We show an efficient (when amortized over multiple runs) maliciously secure two-party secure computation (2PC) protocol in the multiple-execution setting, where the same function is computed multiple times by the same pair of parties.
- We improve the efficiency of 2PC protocols in the publicly verifiable covert security model, where a party can cheat with some probability, but if it gets caught then the honest party obtains a certificate proving that the given party cheated.
- We show how to optimize existing 2PC protocols when the function to be computed includes predicate checks on its inputs.
- We demonstrate an efficient maliciously secure protocol in the three-party setting.
Abstract:
Ionic liquids (ILs) have attracted great attention, from both industry and academia, as alternative fluids for very different types of applications. The large number of possible cations and anions allows a wide range of physical and chemical characteristics to be designed. However, the exhaustive measurement of all these systems is impractical, thus requiring the use of a predictive model for their study. In this work, the predictive capability of the conductor-like screening model for real solvents (COSMO-RS), a model based on unimolecular quantum chemistry calculations, was evaluated for the prediction of the activity coefficient of water at infinite dilution, γw∞, in several classes of ILs. A critical evaluation of the experimental and predicted data using COSMO-RS was carried out. The global average relative deviation was found to be 27.2%, indicating that the model presents a satisfactory ability to estimate γw∞ in a broad range of ILs. The results also showed that the basicity of the IL anions plays an important role in their interaction with water and considerably determines the enthalpic behavior of the binary mixtures composed of ILs and water. Concerning the cation effect, it is possible to state that γw∞ generally increases with the cation size, but it is shown that the cation-anion interaction strength is also important and is strongly correlated with the anion's ability to interact with water. The results reported here are relevant to the understanding of IL-water interactions and the impact of the various structural features of ILs on γw∞, as they allow the development of guidelines for the choice of the most suitable ILs with enhanced interaction with water.
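A small, self-contained sketch of the deviation metric quoted above (the numbers are made up, not the paper's data set): the global average relative deviation compares COSMO-RS predictions with experimental γw∞ values.

import numpy as np

gamma_exp = np.array([1.8, 3.2, 0.9, 2.4])    # hypothetical experimental gamma_w_inf values
gamma_pred = np.array([1.5, 3.9, 1.1, 2.0])   # hypothetical COSMO-RS predictions

# Average relative deviation between predictions and experiment, in percent.
ard = 100.0 * np.mean(np.abs(gamma_pred - gamma_exp) / gamma_exp)
print(f"average relative deviation: {ard:.1f}%")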
An Approach to Manage Reconfigurations and Reduce Area Cost in Hard Real-Time Reconfigurable Systems
Abstract:
This article presents a methodology to build real-time reconfigurable systems that ensure that all the temporal constraints of a set of applications are met, while optimizing the utilization of the available reconfigurable resources. Starting from a static platform that meets all the real-time deadlines, our approach takes advantage of run-time reconfiguration in order to reduce the area needed while guaranteeing that all the deadlines are still met. This goal is achieved by identifying which tasks must always be ready for execution in order to meet the deadlines, and by means of a methodology that further reduces the area requirements.
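The sketch below is only a simplified illustration of the residency decision described above, not the article's actual methodology: a task is kept permanently loaded if reconfiguring it on demand would leave too little slack to meet its deadline; all task parameters are hypothetical.

def must_stay_resident(task, reconfig_time):
    # task = (wcet, deadline); reconfiguring on demand adds reconfig_time latency.
    wcet, deadline = task
    return wcet + reconfig_time > deadline

tasks = {"ctrl": (2.0, 5.0), "logger": (1.0, 50.0), "fft": (8.0, 9.0)}  # hypothetical (ms)
reconfig_time = 4.0  # hypothetical reconfiguration latency (ms)

resident = {name for name, t in tasks.items() if must_stay_resident(t, reconfig_time)}
print("always-loaded tasks:", resident)           # -> {'ctrl', 'fft'}
print("swappable tasks:", set(tasks) - resident)  # -> {'logger'}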
Abstract:
The study of photophysical and photochemical processes is of interest to many fields of research in physics, chemistry and biology. In particular, the photophysical and photochemical reactions that follow light absorption by a photosynthetic pigment-protein complex are among the fastest events in biology, taking place on timescales ranging from tens of femtoseconds to a few nanoseconds. Among the experimental approaches developed for this purpose, ultrafast transient absorption spectroscopy has become a powerful and widely used technique.[1,2] Photosynthesis relies upon the efficient absorption and conversion of the radiant energy from the Sun, with chlorophylls and carotenoids as the main players in the process. Photosynthetic pigments are typically arranged in a highly organized fashion to constitute antennas and reaction centers, supramolecular devices where light harvesting and charge separation take place. The very early steps in the photosynthetic process take place after the absorption of a photon by an antenna system, which harvests light and eventually delivers the excitation to the reaction center. In order to compete with internal conversion, intersystem crossing, and fluorescence, which inevitably lead to energy loss, the energy and electron transfer processes that fix the excited-state energy in photosynthesis must be extremely fast. To investigate these events, ultrafast techniques with sub-100 fs resolution must be used. In this way, energy migration within the system as well as the formation of new chemical species such as charge-separated states can be tracked in real time. This can be achieved by making use of ultrafast transient absorption spectroscopy. The basic principles of this notable technique, its instrumentation, and some recent applications to photosynthetic systems[3] will be described.
Acknowledgements: M. Moreno Oliva thanks the MINECO for a "Juan de la Cierva-Incorporación" research contract.
References:
[1] U. Megerle, I. Pugliesi, C. Schriever, C.F. Sailer and E. Riedle, Appl. Phys. B, 96, 215-231 (2009).
[2] R. Berera, R. van Grondelle and J.T.M. Kennis, Photosynth. Res., 101, 105-118 (2009).
[3] T. Nikkonen, M. Moreno Oliva, A. Kahnt, M. Muuronen, J. Helaja and D.M. Guldi, Chem. Eur. J., 21, 590-600 (2015).
Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes, abnormal system conditions, and stability and security information. These developments provide valuable measurements for power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide-area applications. Experiments and procedures were developed in order to detect abnormal system conditions and apply proper remedies to heal the system. A DC microgrid was designed and integrated into the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, allowing the use of such an architecture in system operation to help remedy abnormal conditions to be studied. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage the power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
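As a minimal illustration of the phasor measurement principle mentioned above (not the dissertation's DAQ code), the sketch below estimates the magnitude and angle of a voltage phasor from one cycle of synthetic samples with a single-bin DFT.

import numpy as np

f0, fs = 60.0, 1920.0                 # nominal frequency and sampling rate (32 samples per cycle)
n = int(fs / f0)
t = np.arange(n) / fs
signal = 120.0 * np.sqrt(2) * np.cos(2 * np.pi * f0 * t + np.deg2rad(30))  # synthetic waveform

# Single-bin DFT at the fundamental, scaled to an RMS phasor.
phasor = np.sqrt(2) / n * np.sum(signal * np.exp(-1j * 2 * np.pi * f0 * t))
print(f"magnitude = {abs(phasor):.1f} V rms, angle = {np.degrees(np.angle(phasor)):.1f} deg")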
Abstract:
Surface plasmon resonance (SPR) and localized surface plasmon resonance (LSPR) biosensors have brought a revolutionary change to the in vitro study of biological and biochemical processes due to their ability to measure extremely small changes in surface refractive index (RI), binding equilibrium and kinetics. Strategies based on LSPR have been employed to enhance the sensitivity for a variety of applications, such as diagnosis of diseases, environmental analysis, food safety, and chemical threat detection. In LSPR spectroscopy, absorption and scattering of light are greatly enhanced at frequencies that excite the LSPR, resulting in a characteristic extinction spectrum that depends on the RI of the surrounding medium. Compositional and conformational changes within the surrounding medium near the sensing surface can therefore be detected as shifts in the extinction spectrum. This dissertation focuses on the development and evaluation of highly sensitive LSPR biosensors for the in situ study of biomolecular binding processes by incorporating nanotechnology. Compared to traditional methods for biomolecular binding studies, LSPR-based biosensors offer real-time, label-free detection. First, we modified the gold sensing surface of LSPR-based biosensors using nanomaterials such as gold nanoparticles (AuNPs) and polymer to enhance surface absorption and sensitivity. The performance of this type of biosensor was evaluated in a binding-affinity study of small heavy-metal molecules. This biosensor exhibited an approximately 7-fold sensitivity enhancement and the capability of measuring binding kinetics, compared to traditional biosensors. Second, a miniaturized cell culture system was integrated into the LSPR-based biosensor system for real-time biomarker signaling pathway studies and drug efficacy studies with living cells. To the best of our knowledge, this is the first LSPR-based sensing platform with the capability of living cell studies. We demonstrated the living cell measurement ability by studying the VEGF signaling pathway in living SKOV-3 cells. Results show that the VEGF secretion level from SKOV-3 cells is 0.0137 ± 0.0012 pg per cell. Moreover, we demonstrated bevacizumab drug regulation of the VEGF signaling pathway using this biosensor. This sensing platform could potentially help in studying biomolecular binding kinetics, which would elucidate the underlying mechanisms of biotransportation and drug delivery.
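A purely illustrative sketch of the read-out principle described above (synthetic spectra, not the dissertation's data): a binding event near the sensing surface appears as a shift of the LSPR extinction peak, tracked here with a simple centroid estimate.

import numpy as np

wavelength = np.linspace(450, 750, 601)                 # nm

def extinction(peak, width=40.0):
    # Idealized Gaussian extinction band centered at the LSPR peak.
    return np.exp(-((wavelength - peak) / width) ** 2)

before = extinction(peak=560.0)   # bare sensor
after = extinction(peak=563.2)    # after analyte binding (hypothetical red-shift)

def centroid(spec):
    # Centroid tracking is less noise-sensitive than taking the argmax of the spectrum.
    return np.sum(wavelength * spec) / np.sum(spec)

print(f"peak shift: {centroid(after) - centroid(before):.2f} nm")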
Abstract:
By mixing concepts from game-theoretic analysis and real options theory, an investment decision in a competitive market can be seen as a "game" between firms, since firms implicitly take into account other firms' reactions to their own investment actions. We review two decades of real option game models, suggesting which critical problems have been "solved" by considering game theory, and which significant problems have not yet been adequately addressed. We provide some insights on plausible empirical applications, or shortfalls in applications to date, and suggest some promising avenues for future research.
Abstract:
Some organizations end up reimplementing the same class of business process over and over: an "administrative process", which consists of managing a form through several states and involving various roles in the organization. This results in wasted time that could be dedicated to better understanding the process or dealing with the fine details that are specific to the process. Existing virtual office solutions require specific training and infrastructure and may result in vendor lock-in. In this paper, we propose using a high-level domain-specific language (AdminDSL) to describe the administrative process and a separate code generator targeting a standard web framework. We have implemented the approach using Xtext, EGL and the Django web framework, and we illustrate it through two case studies: a synthetic examination process which illustrates the architecture of the generated code, and a real-world workplace survey process that identified several future avenues for improvement.
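As a hedged illustration of the kind of code such a generator might target (the model, field names and states below are assumptions for illustration, not the paper's actual generated output), a Django model for a form managed through several states could look like this:

from django.conf import settings
from django.db import models

class WorkplaceSurvey(models.Model):
    # States of the administrative process; roles would gate who may trigger each transition.
    class State(models.TextChoices):
        DRAFT = "draft"
        SUBMITTED = "submitted"
        REVIEWED = "reviewed"
        ARCHIVED = "archived"

    state = models.CharField(max_length=20, choices=State.choices, default=State.DRAFT)
    applicant = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.PROTECT,
                                  related_name="surveys")
    answers = models.JSONField(default=dict)        # the form's content
    updated_at = models.DateTimeField(auto_now=True)

    def submit(self):
        # One state transition; role checks would be enforced in views/permissions.
        self.state = self.State.SUBMITTED
        self.save(update_fields=["state", "updated_at"])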