998 results for cluster validation


Relevance: 20.00%

Publisher:

Abstract:

3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.

Relevance: 20.00%

Publisher:

Abstract:

This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Relevance: 20.00%

Publisher:

Abstract:

Purpose - To develop and validate a psychometric scale for assessing image quality perception for chest X-ray images. Methods - Bandura's theory was used to guide scale development. A review of the literature was undertaken to identify items/factors which could be used to evaluate image quality using a perceptual approach. A draft scale (22 items) was then created and presented to a focus group of student and qualified radiographers, within which it was discussed and modified. A series of seven postero-anterior chest images spanning a range of image qualities was generated using a phantom. Image quality was confirmed for the seven images using the signal-to-noise ratio (SNR 17.2–36.5). Participants (student and qualified radiographers and radiology trainees) were then invited to independently score each of the seven images using the draft image quality perception scale. Cronbach's alpha was used to test internal reliability. Results - Fifty-three participants used the scale to grade image quality perception on each of the seven images. The aggregated mean scale score increased with increasing SNR, from 42.1 to 87.7 (r = 0.98, P < 0.001). For each of the 22 individual scale items there was clear differentiation of low-, mid- and high-quality images. A Cronbach's alpha coefficient of >0.7 was obtained for each of the seven images. Conclusion - This study represents the first development of a chest image quality perception scale based on Bandura's theory. There was excellent correlation between the image quality perception scores derived using the scale and the SNR. Further research will involve a more detailed item and factor analysis.
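
The internal-reliability statistic quoted above (Cronbach's alpha) is straightforward to reproduce; the sketch below computes it with NumPy on synthetic ratings that only mimic the study's dimensions (53 raters × 22 items), not the authors' data.

    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for a (raters x items) score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = scores.shape[1]                          # number of scale items
        item_var = scores.var(axis=0, ddof=1)        # variance of each item across raters
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale score
        return k / (k - 1) * (1 - item_var.sum() / total_var)

    # Synthetic ratings only: 53 raters x 22 items with a shared rater effect.
    rng = np.random.default_rng(0)
    base = rng.normal(3.0, 1.0, size=(53, 1))
    ratings = base + rng.normal(0.0, 0.5, size=(53, 22))
    print(f"alpha = {cronbach_alpha(ratings):.2f}")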

Relevance: 20.00%

Publisher:

Abstract:

A new procedure for determining eleven organochlorine pesticides in soils using microwave-assisted extraction (MAE) and headspace solid-phase microextraction (HS-SPME) is described. The studied pesticides were mirex, α- and γ-chlordane, p,p’-DDT, heptachlor, heptachlor epoxide isomer A, γ-hexachlorocyclohexane, dieldrin, endrin, aldrin and hexachlorobenzene. The HS-SPME step was optimized for the most important parameters, such as extraction time, sample volume and temperature. The analytical procedure requires a reduced volume of organic solvents and avoids the need for extract clean-up steps. Under the optimized conditions, the limits of detection of the method ranged from 0.02 to 3.6 ng/g, intermediate precision ranged from 14 to 36% (as CV%), and recoveries ranged from 8 to 51%. The proposed methodology can be used for rapid screening of soils for the presence of the selected pesticides, and was applied to landfill soil samples.
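
For reference, figures of merit such as percent recovery and CV% follow directly from replicate spiked-sample measurements; the sketch below shows the arithmetic on made-up numbers (the spike level and replicate values are hypothetical, not taken from the study).

    import statistics

    def recovery_pct(mean_found, background, added):
        """Percent recovery of a spiked analyte: (found - background) / added * 100."""
        return (mean_found - background) / added * 100.0

    def cv_pct(replicates):
        """Coefficient of variation (%) of replicate measurements."""
        return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

    # Hypothetical replicate results (ng/g) for one pesticide spiked into soil at 10 ng/g.
    replicates = [4.1, 5.2, 3.6, 4.8, 4.4]
    added = 10.0
    print(f"recovery = {recovery_pct(statistics.mean(replicates), 0.0, added):.0f}%")
    print(f"CV       = {cv_pct(replicates):.0f}%")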

Relevance: 20.00%

Publisher:

Abstract:

A multi-residue methodology based on solid-phase extraction followed by gas chromatography–tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix-matched standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample, to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation followed mainly the International Conference on Harmonisation recommendations, as well as European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness.
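
The weighted least squares step can be illustrated in a few lines; the sketch below fits a straight-line calibration with 1/x² weights, one common empirical choice for heteroscedastic chromatographic data. The weighting scheme and the calibration values shown are assumptions for illustration, not taken from the paper.

    import numpy as np

    def wls_line(x, y, w):
        """Weighted least-squares fit of y = a + b*x with weights w (e.g. 1/x**2)."""
        x, y, w = (np.asarray(v, float) for v in (x, y, w))
        xw, yw = np.average(x, weights=w), np.average(y, weights=w)
        b = np.sum(w * (x - xw) * (y - yw)) / np.sum(w * (x - xw) ** 2)
        return yw - b * xw, b                        # intercept, slope

    # Hypothetical calibration: detector response vs concentration, noise grows with level.
    conc = np.array([0.5, 1, 2, 5, 10, 20, 50])      # ng/mL
    area = np.array([0.9, 2.1, 4.3, 10.8, 20.5, 43.0, 108.0])
    a, b = wls_line(conc, area, w=1 / conc ** 2)     # 1/x^2 weighting down-weights high levels
    print(f"intercept = {a:.3f}, slope = {b:.3f}")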

Relevance: 20.00%

Publisher:

Abstract:

The major lipid components of foods are usually analyzed by separate methodologies, using a different extraction procedure for each class. A simple and fast extraction procedure was devised for the sequential analysis of vitamin E, cholesterol, fatty acids, and total fat estimation in seafood, reducing analysis time and organic solvent consumption. Several liquid/liquid extraction methodologies using chlorinated and non-chlorinated organic solvents were tested. The extract obtained is used for vitamin E quantification (normal-phase HPLC with fluorescence detection), total cholesterol (normal-phase HPLC with UV detection), fatty acid profiling, and total fat estimation (GC-FID), all accomplished in under 40 min. The final methodology presents an adequate linearity range and sensitivity for tocopherols and cholesterol, with intra- and inter-day precisions (RSD) from 3 to 11% for all components. The developed methodology was applied to diverse seafood samples with positive outcomes, making it a very attractive technique for routine analyses in conventionally equipped laboratories in the food quality control field.

Relevance: 20.00%

Publisher:

Abstract:

Firms located within a cluster have access to tacit, complex and specific local knowledge which allows them to develop competitive advantage. However, firms do not have equal ability to access and apply that knowledge, meaning that not all have a similar knowledge absorptive capacity. Using a sample of the largest Portuguese firms within a footwear cluster, this paper examines whether there are significant differences in firms' absorptive capacity and whether such differences within a cluster are related to firm-specific characteristics. The results suggest that absorptive capacity is significantly associated with firm characteristics, namely size, export intensity and position within the cluster.
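
Associations of this kind are typically estimated with a linear regression of an absorptive-capacity score on firm characteristics; the sketch below shows such a fit on synthetic data. The variable names, coefficients and sample are illustrative assumptions, not the paper's data or model.

    import numpy as np

    # Synthetic firm-level data: an absorptive-capacity score regressed on size
    # (log employees), export intensity (exports/sales) and a cluster-position dummy.
    rng = np.random.default_rng(1)
    n = 60
    size = rng.normal(4.0, 1.0, n)
    export_intensity = rng.uniform(0.0, 1.0, n)
    core_position = rng.integers(0, 2, n)
    acap = 0.8 * size + 1.5 * export_intensity + 0.6 * core_position + rng.normal(0, 0.5, n)

    X = np.column_stack([np.ones(n), size, export_intensity, core_position])
    beta, *_ = np.linalg.lstsq(X, acap, rcond=None)
    print(dict(zip(["const", "size", "export_intensity", "core_position"], beta.round(2))))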

Relevance: 20.00%

Publisher:

Abstract:

Master's degree in Management and Business Control (Mestrado em Controlo de Gestão e dos Negócios)

Relevance: 20.00%

Publisher:

Abstract:

WiDom is a prioritized wireless medium access control protocol which offers a very large number of priority levels. Hence, it brings the potential to employ non-preemptive static-priority scheduling and schedulability analysis for a wireless channel, assuming that the overhead of WiDom is modeled properly. One schedulability analysis for WiDom has already been proposed, but recent research has produced a new version (which we call slotted WiDom) with lower overhead, and for this version no schedulability analysis exists. In this paper we propose a new schedulability analysis for slotted WiDom and extend it to message streams with release jitter. We have performed experiments with an implementation of slotted WiDom on a real-world platform (MicaZ). We find that, for each message stream, the maximum observed response time never exceeds the calculated response time, which corroborates our belief that the new scheduling theory is applicable in practice.
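
To make the kind of analysis involved concrete, the sketch below iterates a textbook non-preemptive fixed-priority response-time recursion with release jitter. It illustrates the general technique only: the paper's actual slotted-WiDom analysis accounts for protocol-specific overheads that are ignored here, and the message set is hypothetical.

    import math

    def response_times(streams):
        """Worst-case response times for non-preemptive fixed-priority message streams.

        streams: list of dicts with C (transmission time), T (period) and J (release
        jitter), ordered highest-priority first. Single-job recursion for constrained
        deadlines; protocol overheads are assumed folded into C.
        """
        R = []
        for i, s in enumerate(streams):
            # Blocking: one lower-priority message may already occupy the channel.
            B = max((lp["C"] for lp in streams[i + 1:]), default=0.0)
            w = B
            while True:
                w_next = B + sum((math.floor((w + hp["J"]) / hp["T"]) + 1) * hp["C"]
                                 for hp in streams[:i])
                if w_next == w:
                    break
                w = w_next
            R.append(s["J"] + w + s["C"])   # measured from the stream's release
        return R

    # Hypothetical message set (times in ms), highest priority first.
    msgs = [dict(C=2, T=20, J=1), dict(C=3, T=50, J=2), dict(C=4, T=100, J=0)]
    print(response_times(msgs))             # one bound per stream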

Relevance: 20.00%

Publisher:

Abstract:

Scheduling of constrained-deadline sporadic task systems on multiprocessor platforms is an area which has received much attention in the recent past. It is widely believed that finding an optimal scheduler is hard, and therefore most studies have focused on developing algorithms with good processor utilization bounds. These algorithms can be broadly classified into two categories: partitioned scheduling, in which tasks are statically assigned to individual processors, and global scheduling, in which each task is allowed to execute on any processor in the platform. In this paper we consider a third, more general, approach called cluster-based scheduling. In this approach each task is statically assigned to a processor cluster, tasks in each cluster are globally scheduled among themselves, and clusters in turn are scheduled on the multiprocessor platform. We develop techniques to support such cluster-based scheduling algorithms, and also consider properties that minimize the total processor utilization of individual clusters. In the last part of this paper, we develop new virtual cluster-based scheduling algorithms. For implicit-deadline sporadic task systems, we develop an optimal scheduling algorithm that is neither Pfair nor ERfair. We also show that the processor utilization bound of US-EDF{m/(2m−1)} can be improved by using virtual clustering. Since neither partitioned nor global strategies dominate over the other, cluster-based scheduling is a natural direction for research towards achieving improved processor utilization bounds.
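
As a concrete illustration of the cluster-based idea (not the paper's algorithms), the sketch below assigns tasks to fixed-size processor clusters worst-fit by utilization and checks each cluster with the classical Goossens–Funk–Baruah global-EDF sufficient test for implicit-deadline tasks; the task set and cluster sizes are hypothetical.

    def gfb_schedulable(utils, m):
        """Global-EDF sufficient test (Goossens/Funk/Baruah) for implicit-deadline
        tasks on an m-processor cluster: U_sum <= m - (m - 1) * u_max."""
        return sum(utils) <= m - (m - 1) * max(utils)

    def assign_to_clusters(task_utils, clusters):
        """Worst-fit assignment of tasks to processor clusters by spare capacity.
        clusters: list of processor counts, e.g. [2, 2] for two 2-processor clusters."""
        load = [[] for _ in clusters]
        for u in sorted(task_utils, reverse=True):
            # Pick the cluster with the most spare capacity.
            i = max(range(len(clusters)), key=lambda k: clusters[k] - sum(load[k]))
            load[i].append(u)
            if not gfb_schedulable(load[i], clusters[i]):
                raise ValueError(f"task with u={u} does not fit in cluster {i}")
        return load

    tasks = [0.5, 0.4, 0.4, 0.3, 0.3, 0.2, 0.2]   # hypothetical task utilizations
    print(assign_to_clusters(tasks, clusters=[2, 2]))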

Relevance: 20.00%

Publisher:

Abstract:

Modeling the fundamental performance limits of Wireless Sensor Networks (WSNs) is of paramount importance to understand their behavior under worst-case conditions and to make the appropriate design choices. This is particularly relevant for time-sensitive WSN applications, where the timing behavior of the network protocols (message transmissions must respect deadlines) impacts the correct operation of these applications. In that direction, this paper contributes a methodology based on Network Calculus which enables quick and efficient worst-case dimensioning of static or even dynamically changing cluster-tree WSNs where the data sink can be either static or mobile. We propose closed-form recurrent expressions for computing the worst-case end-to-end delays, buffering and bandwidth requirements across any source–destination path in a cluster-tree WSN. We show how to apply our methodology to the case of IEEE 802.15.4/ZigBee cluster-tree WSNs. Finally, we demonstrate the validity and analyze the accuracy of our methodology through a comprehensive experimental study using commercially available technology, namely TelosB motes running TinyOS.
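
For readers unfamiliar with Network Calculus, the sketch below computes the two basic bounds it rests on: with a token-bucket arrival curve and rate-latency service curves, the end-to-end delay and backlog bounds follow from the concatenated service curve. This is only the generic single-flow result, not the paper's recurrent expressions for cluster-tree topologies, and the numbers are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class TokenBucket:          # arrival curve alpha(t) = burst + rate * t
        burst: float            # bits
        rate: float             # bits/s

    @dataclass
    class RateLatency:          # service curve beta(t) = R * max(0, t - T)
        R: float                # guaranteed rate, bits/s
        T: float                # latency, s

    def end_to_end(alpha, path):
        """Delay and buffer bounds for a flow crossing a tandem of rate-latency servers.
        Concatenation: the path behaves as one server with R = min(R_i), T = sum(T_i)."""
        R = min(s.R for s in path)
        T = sum(s.T for s in path)
        assert alpha.rate <= R, "flow rate must not exceed the bottleneck service rate"
        delay = T + alpha.burst / R                  # horizontal deviation alpha/beta
        backlog = alpha.burst + alpha.rate * T       # vertical deviation alpha/beta
        return delay, backlog

    # Hypothetical 3-hop cluster-tree path: sensor -> router -> router -> sink.
    flow = TokenBucket(burst=2_000, rate=5_000)                # 2 kbit burst, 5 kbit/s
    hops = [RateLatency(R=20_000, T=0.10) for _ in range(3)]   # 20 kbit/s, 100 ms each
    d, q = end_to_end(flow, hops)
    print(f"worst-case delay {d*1e3:.0f} ms, buffer {q:.0f} bits")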

Relevance: 20.00%

Publisher:

Abstract:

Cluster scheduling and collision avoidance are crucial issues in large-scale cluster-tree Wireless Sensor Networks (WSNs). This paper presents a methodology that provides a Time Division Cluster Scheduling (TDCS) mechanism based on a cyclic extension of the RCPS/TC (Resource Constrained Project Scheduling with Temporal Constraints) problem for a cluster-tree WSN, assuming bounded communication errors. The objective is to meet all end-to-end deadlines of a predefined set of time-bounded data flows while minimizing the energy consumption of the nodes by setting the TDCS period as long as possible. Since each cluster is active only once during the period, the end-to-end delay of a given flow may span several periods when there are flows in opposite directions. The scheduling tool enables system designers to efficiently configure all required parameters of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs at network design time. The performance evaluation of the scheduling tool shows that problems with dozens of nodes can be solved using optimal solvers.
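
The period-versus-deadline trade-off described here can be sketched with a toy model: if each cluster is active once per period, a flow that has to "wrap around" the cluster activation order needs an extra period per wrap, so the longest admissible period is bounded by each flow's deadline divided by the number of periods it needs. The code below implements only that coarse bound; it is not the paper's RCPS/TC formulation, and the cluster order, paths and deadlines are made up.

    def periods_needed(path, order):
        """Number of TDCS periods a flow needs to traverse its cluster path, given the
        activation order of clusters inside one period (each cluster active once)."""
        pos = {c: i for i, c in enumerate(order)}
        periods, last = 1, pos[path[0]]
        for c in path[1:]:
            if pos[c] <= last:          # next cluster already had its slot -> wait a period
                periods += 1
            last = pos[c]
        return periods

    def max_tdcs_period(flows, order):
        """Longest TDCS period (s) keeping every flow within its deadline,
        using the coarse bound delay <= periods_needed * period."""
        return min(f["deadline"] / periods_needed(f["path"], order) for f in flows)

    order = ["C0", "C1", "C2", "C3"]                       # cluster activation order per period
    flows = [
        {"path": ["C3", "C2", "C0"], "deadline": 6.0},     # upstream flow (toward the sink)
        {"path": ["C0", "C1", "C3"], "deadline": 4.0},     # downstream flow
    ]
    print(f"max period = {max_tdcs_period(flows, order):.2f} s")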

Relevance: 20.00%

Publisher:

Abstract:

Modeling the fundamental performance limits of Wireless Sensor Networks (WSNs) is of paramount importance to understand their behavior under worst-case conditions and to make the appropriate design choices. In that direction, this paper contributes an analytical methodology for modeling cluster-tree WSNs where the data sink can be either static or mobile. We assess the validity and pessimism of the analytical model by comparing the worst-case results with values measured through an experimental test-bed based on Commercial-Off-The-Shelf (COTS) technologies, namely TelosB motes running TinyOS.

Relevance: 20.00%

Publisher:

Abstract:

Synchronization is a challenging and important issue for time-sensitive Wireless Sensor Networks (WSNs), since it requires mutual spatiotemporal coordination between the nodes. In that respect, the IEEE 802.15.4/ZigBee protocols embody promising technologies for WSNs, but they are still ambiguous about how to efficiently build synchronized multiple-cluster networks, specifically for the case of cluster-tree topologies. In fact, the current IEEE 802.15.4/ZigBee specifications restrict synchronization to beacon-enabled star networks (through the generation of periodic beacon frames), while they support multi-hop networking in mesh topologies, but with no synchronization. Even though both specifications mention the possible use of cluster-tree topologies, which combine multi-hop and synchronization features, the description of how to effectively construct such a network topology is missing. This paper tackles this issue by clarifying the ambiguities regarding the use of the cluster-tree topology and proposing a synchronization mechanism based on Time Division Beacon Scheduling (TDBS) to build cluster-tree WSNs. In addition, we propose a methodology for efficiently managing duty cycles in every cluster, ensuring the fairest use of bandwidth resources. The feasibility of the TDBS mechanism is clearly demonstrated through an experimental test-bed based on our open-source implementation of the IEEE 802.15.4/ZigBee protocols.
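
The scheduling constraint behind TDBS can be made concrete with the standard IEEE 802.15.4 superframe timing: the beacon interval is aBaseSuperframeDuration × 2^BO symbols and the active superframe duration is aBaseSuperframeDuration × 2^SO symbols, so at most 2^(BO−SO) equally sized, non-overlapping superframes fit in one beacon interval. The sketch below computes these quantities and assigns simple beacon offsets; it covers only the homogeneous-SO case and is not the paper's full TDBS mechanism.

    BASE_SUPERFRAME_SYMBOLS = 960      # aBaseSuperframeDuration (IEEE 802.15.4)
    SYMBOL_US = 16                     # symbol duration in the 2.4 GHz band

    def superframe_timing(BO, SO):
        """Beacon interval and superframe duration (s) for given BO/SO (0 <= SO <= BO <= 14)."""
        BI = BASE_SUPERFRAME_SYMBOLS * 2 ** BO * SYMBOL_US / 1e6
        SD = BASE_SUPERFRAME_SYMBOLS * 2 ** SO * SYMBOL_US / 1e6
        return BI, SD

    def tdbs_offsets(n_clusters, BO, SO):
        """Non-overlapping beacon offsets (s) for n clusters sharing one beacon interval.
        At most 2**(BO - SO) clusters fit when all use the same SO."""
        BI, SD = superframe_timing(BO, SO)
        slots = 2 ** (BO - SO)
        if n_clusters > slots:
            raise ValueError(f"only {slots} non-overlapping superframes fit in BI={BI:.3f}s")
        return [i * SD for i in range(n_clusters)]

    BI, SD = superframe_timing(BO=6, SO=3)
    print(f"BI={BI:.3f}s, SD={SD:.3f}s, duty cycle={SD/BI:.1%}")
    print("beacon offsets:", tdbs_offsets(n_clusters=5, BO=6, SO=3))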

Relevance: 20.00%

Publisher:

Abstract:

Simulation analysis is an important approach to developing and evaluating systems in terms of development time and cost. This paper demonstrates the application of the Time Division Cluster Scheduling (TDCS) tool to the configuration of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs using simulation analysis, as an illustrative example that confirms the practical applicability of the tool. The simulation study analyses how the number of retransmissions impacts the reliability of data transmission, the energy consumption of the nodes and the end-to-end communication delay, based on a simulation model implemented in the Opnet Modeler. The configuration parameters of the network are obtained directly from the TDCS tool. The simulation results show that the number of retransmissions affects the reliability, the energy consumption and the end-to-end delay in such a way that improving one may degrade the others.