965 results for Web-Assisted Error Detection
Abstract:
Identification and classification of overlapping nodes in networks are important topics in data mining. In this paper, a network-based (graph-based) semi-supervised learning method is proposed. It is based on competition and cooperation among particles walking in a network to uncover overlapping nodes by generating continuous-valued outputs (soft labels) corresponding to the levels of membership of the nodes in each of the communities. Moreover, the proposed method can be applied to detect overlapping data items in a data set of general form, such as a vector-based data set, once it is transformed into a network. Label propagation usually carries a risk of error amplification; in order to avoid this problem, the proposed method offers a mechanism to identify outliers among the labeled data items, and consequently prevents error propagation from such outliers. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance. © 2012 Springer-Verlag.
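A minimal sketch of the soft-label idea described above, using plain graph-based label propagation as a simplified stand-in for the particle competition/cooperation dynamics (the function name, parameters and damping factor are assumptions, not the authors' algorithm):

```python
# Simplified soft (overlapping) label propagation on a NetworkX graph.
# Stand-in for the particle competition/cooperation method in the abstract.
import numpy as np
import networkx as nx

def soft_labels(G, seeds, n_classes, alpha=0.85, iters=200):
    """Return node list and an (n_nodes, n_classes) matrix of membership
    levels in [0, 1]; seeds maps labeled nodes to their class index."""
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    A = nx.to_numpy_array(G, nodelist=nodes)
    W = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)  # row-stochastic
    Y = np.zeros((len(nodes), n_classes))                     # seed indicators
    for v, c in seeds.items():
        Y[idx[v], c] = 1.0
    F = Y.copy()
    for _ in range(iters):                 # diffuse labels, keep pull toward seeds
        F = alpha * W @ F + (1 - alpha) * Y
    F = F / np.maximum(F.sum(axis=1, keepdims=True), 1e-12)   # soft memberships
    return nodes, F
```

Labeled items whose propagated memberships strongly disagree with their given label can then be flagged as potential outliers before they amplify errors, in the spirit of the mechanism described in the abstract.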
Abstract:
Structural damage identification is essentially a nonlinear phenomenon; however, nonlinear procedures are not currently used in practical applications due to the complexity and difficulty of implementing such techniques. Therefore, the development of techniques that consider the nonlinear behavior of structures for damage detection is research of major importance, since nonlinear dynamical effects can be erroneously treated as damage by classical metrics. This paper proposes the discrete-time Volterra series for modeling the nonlinear convolution between the input and output signals in a benchmark nonlinear system. The prediction error of the model in an unknown structural condition is compared with the values obtained for the reference structure in healthy condition to evaluate the damage detection method. Since the Volterra series separates the response of the system into linear and nonlinear contributions, these contributions are used as indexes to show the importance of considering the nonlinear behavior of the structure. The paper concludes by pointing out the main advantages and drawbacks of this damage detection methodology. © (2013) Trans Tech Publications.
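A minimal sketch of a discrete-time Volterra predictor (first- and second-order kernels) and a prediction-error damage index, assuming the kernels h1 and h2 have already been identified on the healthy reference structure; kernel estimation itself (the harder step) is omitted, and the index definition here is illustrative rather than the paper's exact metric:

```python
# Discrete-time Volterra prediction and a normalized prediction-error index.
import numpy as np

def volterra_predict(u, h1, h2):
    """y[k] = sum_i h1[i] u[k-i]  +  sum_{i,j} h2[i,j] u[k-i] u[k-j]."""
    M = len(h1)
    y = np.zeros(len(u))
    for k in range(M - 1, len(u)):
        w = u[k - M + 1:k + 1][::-1]       # delayed input window
        y[k] = h1 @ w + w @ h2 @ w         # linear + quadratic contributions
    return y

def damage_index(y_measured, y_predicted):
    """Larger values suggest a change relative to the healthy reference."""
    e = y_measured - y_predicted
    return np.sqrt(np.sum(e**2) / np.sum(y_measured**2))
```

Keeping the linear (h1) and quadratic (h2) contributions separate is what allows the index to distinguish nonlinear behavior from damage, as argued in the abstract.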
Abstract:
Bovine tuberculosis (BTB) was introduced into Swedish farmed deer herds in 1987. Epidemiological investigations showed that 10 deer herds had become infected (July 1994), and a common source of infection, a consignment of 168 imported farmed fallow deer, was identified (I). As trace-back of all imported and in-contact deer was not possible, a control program based on tuberculin testing was implemented in July 1994. As Sweden has been free from BTB since 1958, few practicing veterinarians had experience with tuberculin testing. In this test, the result relies on the skill, experience and conscientiousness of the testing veterinarian; deficiencies in performing the test may adversely affect the results and thereby compromise a control program. Quality indicators may identify possible deficiencies in testing procedures. For that purpose, reference values for the measured skin fold thickness (prior to injection of the tuberculin) were established (II), intended to be used mainly by less experienced veterinarians to identify unexpected measurements. Furthermore, the within-veterinarian variation of the measured skin fold thickness was estimated by fitting general linear models to the skin fold measurements (III), with the mean square error used as an estimator of the within-veterinarian variation. Using this method, four veterinarians (6%) were considered to have unexpectedly large variation in their measurements. In certain large extensive deer farms, where mustering of all animals was difficult, meat inspection was suggested as an alternative to tuberculin testing. The efficiency of such a control was estimated in papers IV and V. A Reed-Frost model was fitted to data from seven BTB-infected deer herds and the spread of infection was estimated (< 0.6 effective contacts per deer and year) (IV). These results were used to model the efficiency of meat inspection in an average extensive Swedish deer herd: given 20% annual slaughter and meat inspection, the model predicted that BTB would be either detected or eliminated in most herds (90%) 15 years after the introduction of one infected deer. In 2003, an alternative control for BTB in extensive Swedish deer herds, based on the results of paper V, was implemented.
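A minimal sketch of the kind of Reed-Frost chain-binomial model referred to above, simulating within-herd spread in a closed herd; the contact probability p and herd size are placeholder assumptions, not the estimates reported in the thesis:

```python
# Reed-Frost chain-binomial epidemic in a closed herd (illustrative only).
import numpy as np

def reed_frost(n_susceptible, n_infected, p, generations, rng=None):
    """Return the number of new cases in each generation.
    p = probability of effective contact between one infective and one susceptible."""
    rng = rng or np.random.default_rng()
    S, I, history = n_susceptible, n_infected, []
    for _ in range(generations):
        p_inf = 1.0 - (1.0 - p) ** I       # prob. a susceptible meets >= 1 infective
        new_cases = rng.binomial(S, p_inf)
        S, I = S - new_cases, new_cases
        history.append(new_cases)
    return history

# Example: 100 susceptible deer, 1 introduced case, low contact probability.
print(reed_frost(100, 1, p=0.005, generations=15))
```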
Abstract:
End-user programmers are increasingly relying on web authoring environments to create web applications. Although often consisting primarily of web pages, such applications increasingly go further, harnessing the content available on the web through "programs" that query other web applications for information to drive other tasks. Unfortunately, errors can be pervasive in web applications, impacting their dependability. This paper reports the results of an exploratory study of end-user web application developers, performed with the aim of exposing prevalent classes of errors. The results suggest that end-users struggle the most with the identification and manipulation of variables when structuring requests to obtain data from other web sites. To address this problem, we present a family of techniques that help end-user programmers perform this task, reducing possible sources of error. The techniques focus on simplification and characterization of the data that end-users must analyze while developing their web applications. We report the results of an empirical study in which these techniques are applied to several popular web sites. Our results reveal several potential benefits for end-users who wish to "engineer" dependable web applications.
Abstract:
In the clinical setting, early detection of myocardial injury induced by doxorubicin (DXR) is still considered a challenge. To assess whether ultrasonic tissue characterization (UTC) can identify early DXR-related myocardial lesions and their correlation with myocardial collagen percentages, we studied 60 rats at baseline and prospectively after 2 mg/kg/week intravenous DXR infusion. Echocardiographic examinations were conducted at baseline and at cumulative DXR doses of 8, 10, 12, 14 and 16 mg/kg. The left ventricular ejection fraction (LVEF), shortening fraction (SF), and the UTC indices, namely the corrected coefficient of integrated backscatter (CC-IBS: tissue IBS intensity/phantom IBS intensity) and the magnitude of cyclic variation of this intensity curve (MCV), were measured. The variation of each study parameter across DXR doses was expressed as the mean and standard error at each dosage and at baseline. The collagen percentage was calculated in six control animals and 24 DXR-group animals. From 8 to 16 mg/kg DXR, CC-IBS increased (1.29 ± 0.27 vs. 1.1 ± 0.26 at baseline; p = 0.005) and MCV decreased (9.1 ± 2.8 vs. 11.02 ± 2.6 at baseline; p = 0.006), while LVEF presented only a slight but significant decrease (80.4 ± 6.9% vs. 85.3 ± 6.9% at baseline; p = 0.005). CC-IBS was 72.2% sensitive and 83.3% specific in detecting collagen deposition of 4.24% (AUC = 0.76), whereas LVEF was not accurate in detecting initial collagen deposition (AUC = 0.54). In conclusion, UTC identified DXR-induced myocardial lesions earlier than LVEF, showing good accuracy in detecting initial collagen deposition in this experimental animal model.
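A minimal sketch of how a continuous index such as CC-IBS can be scored against a binary reference (collagen deposition above a chosen cut-off); the function, variable names and cut-offs are illustrative assumptions, and the figures in the abstract come from the animal experiment, not from this code:

```python
# Sensitivity/specificity of an index at a given cut-off (NumPy arrays as input).
import numpy as np

def sensitivity_specificity(index_values, collagen_pct, index_cutoff,
                            collagen_cutoff=4.24):
    positive = collagen_pct >= collagen_cutoff     # reference: relevant collagen deposition
    predicted = index_values >= index_cutoff       # test: index above cut-off
    tp = np.sum(predicted & positive)
    tn = np.sum(~predicted & ~positive)
    return tp / np.sum(positive), tn / np.sum(~positive)
```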
Abstract:
The identification of quantitative trait loci (QTL) and marker-assisted selection with a view to breeding programs have aroused great interest, including for cashew improvement. This study identified QTL for the yield-related traits nut weight, male flowers and hermaphrodite flowers. The traits were evaluated in 71 F1 genotypes of the cross CCP 1001 × CP 96. Interval mapping and multiple QTL mapping were applied to identify the QTL. Eleven QTL were detected: three for nut weight, four for male flowers and four for hermaphrodite flowers. The QTL accounted for 3.79 to 12.98% of the total phenotypic variance and had phenotypic effects of -31.81 to 34.25%. The potential for marker-assisted selection of the QTL hf-2f and hf-3m is great, and their phenotypic effects and percentage of phenotypic variation are higher than those of the others.
Abstract:
Aortic dissection is a disease that can lead to a deadly situation even with correct treatment. It consists of a rupture of a layer of the aortic artery wall, causing blood to flow inside this rupture, called the dissection. The aim of this paper is to contribute to its diagnosis by detecting the dissection edges inside the aorta. A subpixel-accuracy edge detector based on the hypothesis of the partial volume effect is used, where the intensity of an edge pixel is the sum of the contributions of each color weighted by its relative area inside the pixel. The method uses a floating window centred on the edge pixel and computes the edge features. The accuracy of our method is evaluated on synthetic images of different thicknesses and noise levels, obtaining an edge detection with a maximal mean error lower than 16 percent of a pixel.
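A minimal one-dimensional illustration of the partial-volume idea behind subpixel edge location: an edge pixel's intensity is modeled as a mix of the two side intensities weighted by the area each side covers inside the pixel. This is a simplified sketch under that assumption, not the paper's floating-window detector:

```python
# Subpixel edge position from the partial-volume model, 1-D case.
import numpy as np

def subpixel_edge_1d(row, i):
    """Estimate the edge position inside pixel i of a 1-D intensity profile,
    assuming uniform intensities A to the left and B to the right of the edge."""
    A = row[i - 1]                      # intensity on one side of the edge
    B = row[i + 1]                      # intensity on the other side
    f = (row[i] - A) / (B - A)          # fraction of pixel i covered by side B
    return i - 0.5 + f                  # subpixel coordinate of the edge

row = np.array([10.0, 10.0, 16.0, 30.0, 30.0])
print(subpixel_edge_1d(row, 2))         # edge lies ~30% into pixel 2
```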
Abstract:
Recent progress in microelectronics and wireless communications has enabled the development of low-cost, low-power, multifunctional sensors, which has allowed the birth of a new type of network: the wireless sensor network (WSN). The main features of such networks are: the nodes can be positioned randomly over a given field with a high density; each node operates both as a sensor (for the collection of environmental data) and as a transceiver (for the transmission of information to the data retrieval point); and the nodes have limited energy resources. The use of wireless communications and the small size of the nodes make this type of network suitable for a large number of applications. For example, sensor nodes can be used to monitor a high-risk region, such as near a volcano; in a hospital they could be used to monitor the physical condition of patients. For each of these possible application scenarios, it is necessary to guarantee a trade-off between energy consumption and communication reliability. This thesis investigates the use of WSNs in two possible scenarios and, for each of them, suggests a solution to the related problems in light of this trade-off. The first scenario considers a network with a high number of nodes deployed without detailed planning in a given geographical area, which have to transmit data toward a coordinator node, named the sink, assumed to be located onboard an unmanned aerial vehicle (UAV). This is a practical example of reachback communication, characterized by a high density of nodes that have to transmit data reliably and efficiently toward a far receiver. It is assumed that each node transmits a common shared message directly to the receiver onboard the UAV whenever it receives a broadcast message (triggered, for example, by the vehicle), and that the communication channels between the local nodes and the receiver are subject to fading and noise. The receiver onboard the UAV must be able to fuse the weak and noisy signals coherently to receive the data reliably. A cooperative diversity concept is proposed as an effective solution to the reachback problem. In particular, a spread spectrum (SS) transmission scheme is considered in conjunction with a fusion center that can exploit cooperative diversity without requiring stringent synchronization between nodes. The idea consists of simultaneous transmission of the common message by the nodes and Rake reception at the fusion center. The proposed solution is mainly motivated by two goals: the need for simple nodes (to this aim, the computational complexity is moved to the receiver onboard the UAV) and the importance of guaranteeing high energy efficiency of the network, thus increasing the network lifetime. The proposed scheme is analyzed in order to better understand the effectiveness of the approach. The performance metrics considered are the theoretical limit on the maximum amount of data that can be collected by the receiver and the error probability with a given modulation scheme; since we deal with a WSN, both are evaluated taking the energy efficiency of the network into consideration. The second scenario considers the use of a chain network for the detection of fires, with nodes that play the double role of sensors and routers. The first role is the monitoring of a temperature parameter that allows a local binary decision on target (fire) absent/present to be taken.
The second role considers that each node receives the decision made by the previous node of the chain, compares it with the decision derived from its own observation of the phenomenon, and transmits the final result to the next node. The chain ends at the sink node, which transmits the received decision to the user. In this network the goals are to limit the throughput on each sensor-to-sensor link and to minimize the probability of error at the last stage of the chain. This is a typical scenario of distributed detection. To obtain good performance it is necessary to define fusion rules by which each node summarizes its local observation and the decisions of the previous nodes into a final decision that is transmitted to the next node. WSNs have also been studied from a practical point of view, describing both the main characteristics of the IEEE 802.15.4 standard and two commercial WSN platforms. Using a commercial WSN platform, an agricultural application was implemented and tested in a six-month field experiment.
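A minimal sketch of a per-node fusion rule for the serial (chain) detection scenario: each node fuses the one-bit decision received from the previous node with its own temperature-based local decision. The threshold and the two weights are illustrative assumptions, not the thesis' optimized rule:

```python
# One-bit serial fusion at a chain node (illustrative weights and threshold).
def node_fusion(prev_decision, temperature, threshold=60.0,
                weight_prev=1.5, weight_local=1.0):
    """Return the 1-bit decision forwarded to the next node in the chain."""
    local_decision = 1 if temperature > threshold else 0   # local binary test
    # Combine the two bits as signed evidence for "fire present".
    evidence = weight_prev * (2 * prev_decision - 1) \
             + weight_local * (2 * local_decision - 1)
    return 1 if evidence > 0 else 0

# Example: previous node said "fire", local reading is below threshold.
print(node_fusion(prev_decision=1, temperature=55.0))
```

Weighting the incoming decision more heavily than the local one reflects the fact that it already summarizes several earlier observations; choosing these weights per node is exactly the fusion-rule design problem described above.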
Abstract:
This work presents an extensive experimental study of smile detection, testing Local Binary Patterns (LBP) combined with self-similarity (LAC) as the main image descriptors, along with a Support Vector Machine classifier. Results show that error rates can be acceptable and that the self-similarity approach to smile detection is suitable for real-time interaction, although there is still room for improvement.
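A minimal sketch of an LBP-histogram plus SVM pipeline of the kind described above, using scikit-image and scikit-learn; the parameters (8 neighbors, radius 1, uniform patterns, RBF kernel) are common defaults assumed here, not necessarily those used in the study, and the self-similarity (LAC) descriptor is omitted:

```python
# LBP histogram features + SVM classifier for smile detection (sketch).
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(gray_face, P=8, R=1):
    """Uniform-LBP histogram of a grayscale face (or mouth-region) crop."""
    lbp = local_binary_pattern(gray_face, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def train_smile_classifier(train_faces, train_labels):
    """train_faces: list of grayscale arrays; train_labels: 1 = smiling, 0 = not."""
    X = np.array([lbp_histogram(f) for f in train_faces])
    return SVC(kernel="rbf").fit(X, train_labels)
```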
Abstract:
Context-aware computing is currently considered the most promising approach to overcome information overload and to speed up access to relevant information and services. Context-awareness may be derived from many sources, including user profile and preferences, network information, and sensor analysis; usually it relies on the ability of computing devices to interact with the physical world, i.e. with the natural and artificial objects hosted within the "environment". Ideally, context-aware applications should not be intrusive and should be able to react according to the user's context with minimum user effort. Context is an application-dependent multidimensional space, and location has been an important part of it since the very beginning. Location can be used to guide applications in providing the information or functions that are most appropriate for a specific position; hence location systems play a crucial role. There are several technologies and systems for computing location to varying degrees of accuracy, tailored to specific space models, i.e. indoors or outdoors, structured or unstructured spaces. The research challenge faced by this thesis is pedestrian positioning in heterogeneous environments, with a focus on pedestrian identification, localization, orientation and activity recognition. This research was mainly carried out within the "mobile and ambient systems" workgroup of EPOCH, a 6FP NoE on the application of ICT to Cultural Heritage. Therefore, applications in Cultural Heritage sites were the main target of the context-aware services discussed. Cultural Heritage sites are considered significant test-beds for context-aware computing for many reasons. For example, building a smart environment in museums or protected sites is a challenging task, because localization and tracking are usually based on technologies that are difficult to hide or harmonize within the environment. It is therefore expected that the experience gained with this research may also be useful in domains other than Cultural Heritage. This work presents three different approaches to pedestrian identification, positioning and tracking: pedestrian navigation by means of a wearable inertial sensing platform assisted by a vision-based tracking system for initial settings and real-time calibration; pedestrian navigation by means of a wearable inertial sensing platform augmented with GPS measurements; and pedestrian identification and tracking combining the vision-based tracking system with WiFi localization. The proposed localization systems have mainly been used to enhance Cultural Heritage applications by providing information and services depending on the user's actual context, in particular on the user's location.