889 results for systems engineering
Abstract:
Semi-supervised learning is one of the important topics in machine learning, concerned with pattern classification when only a small subset of the data is labeled. In this paper, a new network-based (or graph-based) semi-supervised classification model is proposed. It employs a combined random-greedy walk of particles, with competition and cooperation mechanisms, to propagate class labels to the whole network. Due to the competition mechanism, the proposed model spreads labels in a local fashion: each particle visits only a portion of the nodes potentially belonging to it, and it is not allowed to visit nodes definitely occupied by particles of other classes. In this way, a "divide-and-conquer" effect is naturally embedded in the model. As a result, the proposed model achieves a good classification rate while exhibiting a low computational complexity order compared to other network-based semi-supervised algorithms. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance.
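The abstract's mechanism can be illustrated with a minimal stdlib-only sketch, not the paper's exact algorithm: one particle per labeled node performs a random-greedy walk and raises its class's domination level on visited nodes while lowering the others'. The graph, the 0.5 greedy/random split, and the delta update rule are illustrative assumptions; the paper's "no visit" rule for occupied nodes is simplified here to "no effect" on labeled nodes.

```python
import random

def particle_competition(adj, labels, steps=2000, delta=0.1, seed=0):
    """Toy sketch of particle competition/cooperation label propagation.

    adj    : dict node -> list of neighbour nodes
    labels : dict mapping the few labelled nodes -> class id (>= 2 classes)
    Returns a dict node -> predicted class id (argmax of domination levels).
    """
    rng = random.Random(seed)
    classes = sorted(set(labels.values()))
    # Domination level of each class on each node (uniform on unlabelled nodes).
    dom = {v: {c: 1.0 / len(classes) for c in classes} for v in adj}
    for v, c in labels.items():  # labelled nodes are fully dominated from the start
        dom[v] = {k: (1.0 if k == c else 0.0) for k in classes}
    # One particle per labelled node, starting at its home node.
    particles = [(v, c) for v, c in labels.items()]
    for _ in range(steps):
        for i, (pos, c) in enumerate(particles):
            nbrs = adj[pos]
            if rng.random() < 0.5:
                # Greedy move: prefer the neighbour most dominated by own class.
                nxt = max(nbrs, key=lambda u: dom[u][c])
            else:
                # Random move: explore the neighbourhood.
                nxt = rng.choice(nbrs)
            if nxt not in labels:  # labelled nodes never change hands
                # Competition: strengthen own class, weaken the others.
                for k in classes:
                    if k == c:
                        dom[nxt][k] = min(1.0, dom[nxt][k] + delta)
                    else:
                        dom[nxt][k] = max(0.0, dom[nxt][k] - delta / (len(classes) - 1))
            particles[i] = (nxt, c)
    return {v: max(dom[v], key=dom[v].get) for v in adj}
```

Because each particle mostly stays inside the region it dominates, the walk naturally partitions the network, which is the "divide-and-conquer" effect the abstract describes.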
Abstract:
This paper proposes an evolutionary computing strategy to solve the problem of fault indicator (FI) placement in primary distribution feeders. More specifically, a genetic algorithm (GA) is employed to search for an efficient configuration of FIs, located at the best positions on the main feeder of a real-life distribution system. Thus, the problem is modeled as one of optimization, aimed at improving the distribution reliability indices, while, at the same time, finding the least expensive solution. Based on actual data, the results confirm the efficiency of the GA approach to the FI placement problem.
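A minimal GA of the kind the abstract describes can be sketched as follows. The chromosome encoding (one bit per candidate section) is standard for placement problems, but the fitness model here, outage cost saved by an FI minus a per-unit cost, is a deliberately simplified stand-in for the paper's reliability indices, and all parameter values are assumptions.

```python
import random

def ga_fi_placement(n_sections, fail_rate, load, fi_cost,
                    pop_size=40, gens=60, pmut=0.05, seed=1):
    """Toy GA sketch for fault-indicator (FI) placement.

    A chromosome is a bit string: bit i == 1 places an FI at section i.
    Fitness rewards a (hypothetical) interruption-cost reduction and
    penalises the cost of the installed FIs.
    """
    rng = random.Random(seed)

    def fitness(chrom):
        # Illustrative model: an FI at section i reduces outage time for
        # the load served there, in proportion to the section failure rate.
        saved = sum(fail_rate[i] * load[i] for i, bit in enumerate(chrom) if bit)
        return saved - fi_cost * sum(chrom)

    pop = [[rng.randint(0, 1) for _ in range(n_sections)] for _ in range(pop_size)]
    for _ in range(gens):
        def pick():  # binary tournament selection
            p, q = rng.sample(pop, 2)
            return p if fitness(p) >= fitness(q) else q
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_sections)  # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation with probability pmut per gene.
            child = [1 - b if rng.random() < pmut else b for b in child]
            nxt.append(child)
        pop = nxt
    best = max(pop, key=fitness)
    return best, fitness(best)
```

On a small feeder this converges quickly to placing FIs only where the expected benefit exceeds the device cost, mirroring the trade-off between reliability indices and expense described above.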
Abstract:
In this work, a method for computing PD stabilising gains for rotating systems is presented, based on the D-decomposition technique, which requires knowledge only of frequency response functions. By applying this method to a rotating system with electromagnetic actuators, it is demonstrated that the stability boundary locus in the plane of feedback gains can be easily plotted and the most suitable gains can be found to minimise the resonant peak of the system. Experimental results for a Laval rotor show that the approach is feasible not only for controlling lateral shaft vibration and assuring stability, but also for predicting the final vibration level achieved by the closed-loop system. These results are obtained based solely on the input-output response of the system as a whole.
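The D-decomposition idea is compact enough to sketch: on the stability boundary the closed-loop characteristic equation 1 + (kp + jw*kd)G(jw) = 0 holds at some real frequency w, so kp + jw*kd = -1/G(jw), and the boundary locus in the (kp, kd) plane follows directly from measured FRF samples. The second-order plant below stands in for measured rotor FRF data and is purely an assumption.

```python
def d_decomposition_boundary(frf, omegas):
    """Stability boundary points (kp, kd) for PD feedback, from FRF samples.

    On the boundary, 1 + (kp + j*w*kd)*G(jw) = 0 for some real w, hence
        kp = Re(-1/G(jw)),   kd = Im(-1/G(jw)) / w.
    frf : callable mapping a frequency w (rad/s) to the plant response G(jw).
    """
    pts = []
    for w in omegas:
        inv = -1.0 / frf(w)
        pts.append((inv.real, inv.imag / w))
    return pts

# Toy plant standing in for measured FRF data (an assumption): a lightly
# damped second-order system G(s) = 1 / (s^2 + 2*z*w0*s + w0^2).
def toy_frf(w, w0=10.0, z=0.02):
    s = 1j * w
    return 1.0 / (s * s + 2 * z * w0 * s + w0 * w0)

boundary = d_decomposition_boundary(toy_frf, [k / 10 for k in range(1, 300)])
```

For this toy plant the boundary works out analytically to kp = w^2 - w0^2 and kd = -2*z*w0, which is a handy sanity check; with real FRF measurements one simply tabulates (kp, kd) over the measured frequency grid and plots the locus.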
Abstract:
Metadata are data that describe other data and the domains they represent, allowing users to make the best possible decisions about their use. They report on the existence of data sets linked to specific needs. Metadata are used to document and organize structured organizational data, in order to minimize duplicated effort in locating them and to facilitate their maintenance. They also support the administration of large amounts of data, providing discovery, retrieval and editing features. The global use of metadata is regulated by technical groups or task forces composed of several segments, such as industry, universities and research firms. Agriculture in particular is a good example of a field for typical metadata applications: the integration of systems and equipment enables the techniques used in precision agriculture, and the integration of different computer systems, via web services or other solutions, requires the integration of structured data. The purpose of this paper is to present an overview of consolidated metadata standards for agriculture.
Abstract:
Access control is a key component of security in any computer system. Over the last two decades, research on Role-Based Access Control models has been intense. One of the most important components of a role-based model is the role-permission relationship. In this paper, the systematic mapping technique is used to identify, extract and analyze the many approaches applied to establish the role-permission relationship. The main goal of this mapping is to point out directions for significant research in the area of Role-Based Access Control models.
Abstract:
Telecommunications have been in constant evolution over the past decades. Among the technological innovations, the use of digital technologies is particularly relevant. Digital communication systems have proven their efficiency and brought a new element into the chain of signal transmission and reception: the digital processor. This device gives new radio equipment the flexibility of a programmable system. Nowadays, the behavior of a communication system can be modified simply by changing its software. This gave rise to a new radio model called Software-Defined Radio (SDR). In this model, the task of defining the radio's behavior is moved to software, leaving to the hardware only the implementation of the RF front-end. Thus, the radio is no longer static, defined by its circuits; it becomes a dynamic element whose operating characteristics, such as bandwidth, modulation and coding rate, can be modified even at runtime, according to the software configuration. This article presents the use of GNU Radio, an open-source solution for SDR applications, as a tool for developing configurable digital radios.
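The core SDR idea, radio behavior selected by configuration rather than by circuitry, can be shown with a toy stdlib-only sketch (this is not GNU Radio itself; the modulator table and config keys are illustrative assumptions). The same transmit chain produces BPSK or QPSK baseband symbols depending only on a software setting, just as an SDR flow graph swaps modulation blocks at runtime.

```python
import math

# The bit-to-symbol mapping is chosen purely in software.
MODULATORS = {
    # BPSK: one bit per symbol, mapped to +1/-1 on the real axis.
    "bpsk": lambda bits: [complex(2 * b - 1, 0) for b in bits],
    # QPSK: two bits per symbol, mapped onto the unit circle.
    "qpsk": lambda bits: [
        complex((2 * bits[i] - 1) / math.sqrt(2),
                (2 * bits[i + 1] - 1) / math.sqrt(2))
        for i in range(0, len(bits) - 1, 2)
    ],
}

def transmit(bits, config):
    """The 'radio' reads its modulation behaviour from a software config."""
    return MODULATORS[config["modulation"]](bits)

# Changing one config value changes the radio's air-interface behaviour.
symbols = transmit([1, 0, 1, 1], {"modulation": "qpsk"})
```

In GNU Radio the same reconfiguration is done by rewiring flow-graph blocks; the point here is only that no hardware changes are involved.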
Abstract:
Increasing the loading capacity of transmission lines in the traditional way, by replacing or reinforcing structures and foundations on routes crossing areas of permanent environmental preservation, may require additional works that alter the environment. Today's rigorous environmental legislation renders such changes and substitutions unfeasible. One way to increase the capacity of these lines is to use new conductor technologies. The aim of this paper is to discuss the needs involved in upgrading a transmission line while minimizing or eliminating damage to the environment through the use of special conductors. Because aluminum conductor composite reinforced (ACCR) technology is relatively new, and considering the lack of information on its effective performance in practical systems, the behavior of these conductors needs to be verified through monitoring procedures.
Abstract:
The ALRED construction is a lightweight strategy for building message authentication algorithms from an underlying iterated block cipher. Even though the construction's original analyses show that it is secure against some attacks, the absence of formal security proofs in a strong security model still brings uncertainty about its robustness. In this paper, aiming to give a better understanding of the security level provided by different authentication algorithms based on this design strategy, we formally analyze two ALRED variants, the MARVIN message authentication code and the LETTERSOUP authenticated-encryption scheme, bounding their security as a function of the attacker's resources and of the underlying cipher's characteristics.
Abstract:
A wide range of telecommunications services transmit voice, video and data through complex transmission networks, and in some cases the service does not reach an acceptable quality level for the end user. In this context, methods for assessing video and voice quality play a very important role. This paper presents a classification scheme, based on different criteria, for the methods and metrics studied in recent years. It then examines how video quality is affected by degradation in the transmission channel in two kinds of services: digital TV (ISDB-Tb), due to fading in the air interface, and a video streaming service on an IP network, due to packet loss. For the digital TV tests, a scenario was set up in which the digital TV transmitter is connected to an RF channel emulator, different fading models are inserted, and the resulting videos are recorded on a mobile device. The streaming video tests were performed in an isolated IP network scenario in which several network conditions were scheduled, resulting in different received video qualities. Video quality is assessed using the objective methods PSNR, SSIM and VQM. The results show how losses in the transmission channel affect the quality of the end-user experience in both services studied.
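Of the three objective metrics named above, PSNR is the simplest to state, and a short stdlib-only sketch makes the computation concrete (frames are represented here as plain 2-D lists of 8-bit intensities, an assumption for illustration):

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equally sized frames.

    ref, test : 2-D lists of pixel intensities (one video frame each).
    Identical frames give float('inf'); lower values mean more degradation.
    """
    n = 0
    sq_err = 0.0
    for row_r, row_t in zip(ref, test):
        for r, t in zip(row_r, row_t):
            sq_err += (r - t) ** 2
            n += 1
    mse = sq_err / n  # mean squared error over all pixels
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / mse)
```

In a quality-assessment pipeline like the one described above, this is computed frame by frame between the transmitted reference and the received, possibly corrupted, video; SSIM and VQM add perceptual modelling on top of this kind of pixel-level comparison.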
Abstract:
We introduce a five-parameter continuous model, called the McDonald inverted beta distribution, to extend the two-parameter inverted beta distribution and provide new four- and three-parameter sub-models. We give a mathematical treatment of the new distribution, including expansions for the density function, moments, generating and quantile functions, mean deviations, entropy and reliability. The model parameters are estimated by maximum likelihood and the observed information matrix is derived. An application of the new model to real data shows that it can consistently give a better fit than other important lifetime models.
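The McDonald (Mc-G) family is commonly written as f(x) = c g(x) G(x)^(ac-1) (1 - G(x)^c)^(b-1) / B(a,b) for a baseline density g with CDF G. Assuming that construction with the inverted beta (beta prime) baseline, the five-parameter density can be sketched with the standard library only; here the baseline CDF is accumulated numerically along the evaluation grid rather than via the incomplete beta function, which is an implementation shortcut, not the paper's method.

```python
import math

def beta_fn(a, b):
    """Beta function via log-gamma, numerically stable for moderate arguments."""
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

def mc_inverted_beta_pdf_on_grid(xs, alpha, beta, a, b, c):
    """Density values of the (assumed) McDonald inverted beta on a grid.

    Baseline: inverted beta g(x) = x^(alpha-1) (1+x)^(-alpha-beta) / B(alpha, beta).
    Mc-G layer: f = c * g * G^(a*c - 1) * (1 - G^c)^(b-1) / B(a, b),
    with the baseline CDF G accumulated by the trapezoidal rule along xs
    (xs must be increasing and start near 0; needs a*c >= 1 at x = 0).
    """
    Bg = beta_fn(alpha, beta)
    Bm = beta_fn(a, b)
    g = [x ** (alpha - 1) * (1 + x) ** (-alpha - beta) / Bg for x in xs]
    G, acc = [0.0], 0.0
    for i in range(1, len(xs)):
        acc += 0.5 * (g[i] + g[i - 1]) * (xs[i] - xs[i - 1])
        G.append(min(acc, 1.0))
    return [c * gi * Gi ** (a * c - 1) * (1 - Gi ** c) ** (b - 1) / Bm
            for gi, Gi in zip(g, G)]
```

Setting c = 1 recovers the beta-generalized sub-model and a = b = c = 1 recovers the baseline inverted beta, which matches the abstract's nesting of four- and three-parameter sub-models.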
Abstract:
This paper introduces a JME-compliant cryptographic library for mobile application development. The library allows the implementation of cryptographic protocols over elliptic curves with different security levels, and offers symmetric and asymmetric bilinear pairing operations, such as the Tate, Weil and Ate pairings.
Abstract:
The Primary Care Information System (SIAB) concentrates basic healthcare information from all regions of Brazil. The information is collected by primary care teams in a paper-based procedure that degrades the quality of the information provided to healthcare authorities and slows down the decision-making process. To overcome these problems, we propose a new data-gathering application in which a mobile device connected to a 3G network and a GPS is used by primary care teams to collect data on families. A prototype was developed in which a digital version of one SIAB form is made available on the mobile device. The prototype was tested in a basic healthcare unit located in a suburb of Sao Paulo. The results obtained so far show that the proposed process is a better alternative for data collection in primary care, both in terms of data quality and of shorter delivery time to healthcare authorities.
Abstract:
Failure detection is at the core of most fault-tolerance strategies, but it often depends on reliable communication. We present new failure detector algorithms that are appropriate as components of a fault-tolerance system deployed under adverse network conditions (such as loosely connected and loosely administered computing grids). Our approach packs redundancy into heartbeat messages, thereby improving on the robustness of traditional protocols. Results from experimental tests conducted in a simulated environment with adverse network conditions show a significant improvement over existing solutions.
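One way to read "packing redundancy into heartbeat messages" is that each heartbeat repeats the last few sequence numbers, so a single delivered message compensates for earlier losses; the sketch below illustrates that idea with assumed names and a fixed window size, not the paper's actual protocol.

```python
WINDOW = 3  # each heartbeat redundantly carries the last 3 sequence numbers

def make_heartbeat(seq):
    """Heartbeat message for sequence number `seq`, carrying redundant
    recent sequence numbers so one delivery can cover prior losses."""
    return {"seqs": list(range(max(1, seq - WINDOW + 1), seq + 1))}

class Monitor:
    """Receiver side of the failure detector: tracks the highest sequence
    number learned from any delivered message, direct or redundant."""

    def __init__(self):
        self.highest = 0

    def deliver(self, msg):
        self.highest = max(self.highest, max(msg["seqs"]))

    def missed(self, now_seq):
        """How many heartbeat periods behind the monitor currently is;
        a suspicion threshold would be compared against this value."""
        return now_seq - self.highest
```

With a plain one-sequence-number heartbeat, losing messages 3 and 4 leaves the monitor two periods behind until the next arrival; with the redundant window, message 5 alone closes the gap, which is the robustness gain the abstract points to.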
Abstract:
Background: There are no available statistical data about sudden cardiac death in Brazil. Therefore, this study was conducted to evaluate the incidence of sudden cardiac death in our population and its implications. Methods: The research methodology was based on Thurstone's Law of Comparative Judgment, whose premise is that the more a stimulus A differs from a stimulus B, the greater the number of people who will perceive this difference. This technique allows actual occurrences to be estimated from subjective perceptions when compared against official statistics. Data were collected through telephone interviews conducted with primary and secondary care physicians of the Public Health Service in the Metropolitan Area of Sao Paulo (MASP). Results: In the period from October 19, 2009, to October 28, 2009, 196 interviews were conducted. An incidence of 21,270 cases of sudden cardiac death per year was estimated by linear regression analysis of the physicians' responses against data from the Mortality Information System of the Brazilian Ministry of Health, with the following correlation and determination coefficients: r = 0.98 and r^2 = 0.95 (95% confidence interval 0.8-1.0, P < 0.05). The lack of a waiting list for specialized care and socio-administrative problems were considered the main barriers to tertiary care access. Conclusions: The incidence of sudden cardiac death in the MASP is high, and it was estimated as being higher than all other causes of death; the extrapolation technique based on the physicians' perceptions was validated; and the most important bureaucratic barriers to patient referral to tertiary care were identified. (PACE 2012; 35:1326-1331)
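The extrapolation step reduces to ordinary least squares: perceived frequencies are regressed against the official counts, the correlation validates the perceptions, and the fitted line is then used to estimate the unrecorded category. A stdlib-only sketch with entirely hypothetical numbers (not the study's data):

```python
def linreg(xs, ys):
    """Ordinary least-squares fit y = a*x + b, returning (a, b, r).

    xs could be official mortality counts for causes that ARE recorded,
    ys the corresponding perceived frequencies from physician interviews;
    r close to 1 supports extrapolating to an unrecorded cause.
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    r = sxy / (sxx ** 0.5 * syy ** 0.5)  # Pearson correlation
    return a, b, r
```

Inverting the fitted line at the perceived frequency of sudden cardiac death then yields the incidence estimate; the study reports r = 0.98 for this calibration, which is what licenses the extrapolation.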
Abstract:
Bromelain is an aqueous extract of pineapple that contains a complex mixture of proteases and non-protease components. These enzymes play an important role in the proteolytic modulation of the cellular matrix in numerous physiological processes, including anti-inflammatory, anti-thrombotic and fibrinolytic functions. Given the scale of global pineapple (Ananas comosus L.) production and the high percentage of waste generated in its cultivation and processing, several studies have been conducted on the recovery of bromelain. The aim of this study was to purify bromelain from pineapple wastes using an easy-to-scale-up process of precipitation by ethanol. The results showed that bromelain was recovered using ethanol at concentrations of 30% and 70%, achieving a purification factor of 2.28-fold and yielding more than 98% of the total enzymatic activity. The enzyme proved to be susceptible to denaturation after lyophilization; however, by using 10% (w/v) glucose as a cryoprotectant, it was possible to preserve 90% of the original enzymatic activity. The efficiency of the purification process was confirmed by SDS-PAGE and native-PAGE electrophoresis, fluorimetry, circular dichroism and FTIR analyses, showing that this method can be used to obtain highly purified and structurally stable bromelain.