956 results for Data Structure Evaluation


Relevance:

30.00%

Publisher:

Abstract:

Cluster scheduling and collision avoidance are crucial issues in large-scale cluster-tree Wireless Sensor Networks (WSNs). The paper presents a methodology that provides a Time Division Cluster Scheduling (TDCS) mechanism based on the cyclic extension of the RCPS/TC (Resource Constrained Project Scheduling with Temporal Constraints) problem for a cluster-tree WSN, assuming bounded communication errors. The objective is to meet all end-to-end deadlines of a predefined set of time-bounded data flows while minimizing the energy consumption of the nodes by setting the TDCS period as long as possible. Since each cluster is active only once during the period, the end-to-end delay of a given flow may span several periods when there are flows in opposite directions. The scheduling tool enables system designers to efficiently configure all required parameters of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs at network design time. The performance evaluation of the scheduling tool shows that problems with dozens of nodes can be solved using optimal solvers.
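A minimal sketch of the delay effect described above, not the authors' tool: when each cluster is active exactly once per TDCS period, a message forwarded along a path waits in each cluster for that cluster's next active slot, so the end-to-end delay can span several periods. The slot offsets, slot duration, period, and flow path below are hypothetical.

```python
# Sketch: end-to-end delay of one flow over a cyclic Time Division Cluster Schedule.
# Assumption: each cluster is active once per period and forwards the message by the
# end of its next active slot. All numeric values are illustrative.
import math

def end_to_end_delay(path_slots, slot_duration, period, release_time=0.0):
    """path_slots: start offsets (within the period) of the active slots of the
    clusters traversed by the flow, in path order."""
    t = release_time
    for slot_start in path_slots:
        # Next activation of this cluster at or after time t (cyclic schedule).
        k = max(0, math.ceil((t - slot_start) / period))
        activation = slot_start + k * period
        t = activation + slot_duration   # message forwarded by the end of the slot
    return t - release_time

# Example: the second hop is scheduled earlier in the period than the first one,
# so the delay spans into the next period.
print(end_to_end_delay(path_slots=[10.0, 2.0, 20.0], slot_duration=4.0, period=50.0))
```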

Relevance:

30.00%

Publisher:

Abstract:

In this cross-sectional study we analyzed whether team climate for innovation mediates the relationship between team task structure and innovative behavior, job satisfaction, affective organizational commitment, and work stress. A total of 310 employees in 20 work teams of an automotive company participated in this study. Ten teams had been changed from a restrictive to a more self-regulating team model by providing task variety, autonomy, team-specific goals, and feedback in order to increase team effectiveness. The data support the supposed causal chain, although only for team innovative behavior were all required effects statistically significant. Longitudinal designs and larger samples are needed to prove the assumed causal relationships, but the results indicate that implementing self-regulating teams might be an effective strategy for improving innovative behavior and thus team and company effectiveness.
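A minimal sketch of the kind of mediation check described (task structure, then team climate for innovation as mediator, then an outcome), fitted with ordinary least squares on synthetic data; the variable names and data are hypothetical and this is not the study's analysis.

```python
# Sketch of a simple mediation check (predictor -> mediator -> outcome) using OLS.
# Variable names and synthetic data are hypothetical, not the study's data set.
import numpy as np

rng = np.random.default_rng(0)
n = 310
task_structure = rng.normal(size=n)                    # e.g. degree of team self-regulation
climate = 0.5 * task_structure + rng.normal(size=n)    # mediator: team climate for innovation
innovative = 0.6 * climate + 0.1 * task_structure + rng.normal(size=n)  # outcome

def ols(y, *xs):
    """Return [intercept, coefficients...] from an ordinary least squares fit."""
    X = np.column_stack([np.ones_like(y)] + list(xs))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(climate, task_structure)[1]                          # path a: task structure -> climate
b, c_prime = ols(innovative, climate, task_structure)[1:3]   # path b and direct effect c'
c = ols(innovative, task_structure)[1]                       # total effect
print(f"indirect a*b = {a * b:.3f}, direct c' = {c_prime:.3f}, total c = {c:.3f}")
```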

Relevance:

30.00%

Publisher:

Abstract:

This Thesis describes the application of automatic learning methods for a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds.

NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs) taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, the evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass. Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set), and Counter-Propagation Neural Networks (CPNNs) gave similar results. Supervised learning techniques improved these results: correct assignments increased to 77% when an ensemble of ten FFNNs was used and to 80% when Random Forests were used. This study was performed with NMR data simulated from the molecular structure by the SPINUS program. In the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions.

Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and the genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms.
Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified. In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by MOLMAP descriptors and submitted to Kohonen SOMs to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation. EC numbers were correctly assigned in 95%, 90%, 85%, and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass, and full EC number levels, respectively. Experiments for the classification of reactions from the main reactants and products were performed with RFs; EC numbers were assigned at the class, subclass, and sub-subclass levels with accuracies of 78%, 74%, and 63%, respectively. In the course of the experiments with metabolic reactions, we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways. Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway - a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers.

Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-PES and from the LJ function. The results indicated that, for LJ-type potentials, NNs can be trained to generate accurate PES for use in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs.
The NN models showed a remarkable ability to interpolate between distant curves and to accurately reproduce potentials for use in molecular simulations. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task. The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to cover well the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform equally well or even better than analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
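A minimal sketch of the idea behind a MOLMAP-like reaction descriptor, the difference between a product-side and a reactant-side map of bond types, fed to a Random Forest classifier. The bond-type "maps" here are plain count vectors over a hypothetical bond vocabulary, standing in for the SOM-based MOLMAPs of the Thesis, and the toy reactions are invented.

```python
# Sketch: reaction descriptor = (bond-type map of products) - (bond-type map of reactants),
# classified with a Random Forest. Counts over a hypothetical bond vocabulary stand in
# for the SOM-derived MOLMAPs described in the Thesis.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

BOND_TYPES = ["C-C", "C=C", "C-O", "C=O", "C-N", "O-H", "N-H"]   # hypothetical vocabulary

def bond_map(bond_counts):
    """Fixed-length vector of bond-type counts for one side of a reaction."""
    return np.array([bond_counts.get(b, 0) for b in BOND_TYPES], dtype=float)

def reaction_descriptor(reactant_bonds, product_bonds):
    # Negative entries: bonds broken; positive entries: bonds made.
    return bond_map(product_bonds) - bond_map(reactant_bonds)

# Toy training set: two reaction "classes" with different bond-change patterns.
X = np.array([
    reaction_descriptor({"C-O": 1, "O-H": 1}, {"C=O": 1}),
    reaction_descriptor({"C-O": 2, "O-H": 1}, {"C=O": 1}),
    reaction_descriptor({"C-N": 1}, {"C-N": 1, "N-H": 1}),
    reaction_descriptor({"C-N": 2}, {"C-N": 2, "N-H": 1}),
])
y = [0, 0, 1, 1]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([reaction_descriptor({"C-O": 1, "O-H": 2}, {"C=O": 1})]))
```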

Relevance:

30.00%

Publisher:

Abstract:

Environment monitoring has an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not give the necessary information to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks developed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability than assessing personal exposures using continuous 8-hour time-weighted average measurements. Health effects related to exposure to particles have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies support that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry, and horse stable), applying these two resources: task-based exposure assessment and particle number concentration by size. The task-based approach made it possible to identify the tasks with the highest exposure to the smallest particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
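A minimal sketch of how task-based readings of particle number concentration by size could be aggregated to flag the highest-exposure task in a given size bin; the tasks, size bins, and concentration values below are illustrative, not the study's measurements.

```python
# Sketch: aggregate particle number concentrations per task and size bin, then flag the
# task with the highest mean concentration in the smallest bin. All values are illustrative.
from collections import defaultdict
from statistics import mean

# (task, size_bin_um, particle number concentration reading) - illustrative data
readings = [
    ("dough preparation", 0.3, 410), ("dough preparation", 0.3, 530),
    ("oven unloading",    0.3, 980), ("oven unloading",    0.3, 1020),
    ("cleaning",          0.3, 650),
    ("oven unloading",    1.0, 120), ("cleaning", 1.0, 90),
]

by_task_bin = defaultdict(list)
for task, size_bin, value in readings:
    by_task_bin[(task, size_bin)].append(value)

means = {key: mean(vals) for key, vals in by_task_bin.items()}
worst_03 = max((k for k in means if k[1] == 0.3), key=lambda k: means[k])
print(f"Highest mean 0.3 um exposure: {worst_03[0]} ({means[worst_03]:.0f} readings average)")
```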

Relevance:

30.00%

Publisher:

Abstract:

This project was developed within the ART-WiSe framework of the IPP-HURRAY group (http://www.hurray.isep.ipp.pt), at the Polytechnic Institute of Porto (http://www.ipp.pt). The ART-WiSe – Architecture for Real-Time communications in Wireless Sensor networks – framework (http://www.hurray.isep.ipp.pt/art-wise) aims at providing new communication architectures and mechanisms to improve the timing performance of Wireless Sensor Networks (WSNs). The architecture is based on a two-tiered protocol structure, relying on existing standard communication protocols, namely IEEE 802.15.4 (Physical and Data Link Layers) and ZigBee (Network and Application Layers) for Tier 1, and IEEE 802.11 for Tier 2, which serves as a high-speed backbone for Tier 1 without energy consumption restrictions. Within this trend, an application test-bed is being developed with the objectives of implementing, assessing, and validating the ART-WiSe architecture. This is particularly relevant for the ZigBee protocol: even though there is a strong commercial lobby from the ZigBee Alliance (http://www.zigbee.org), there is currently neither an open-source implementation available to the community nor publications on its adequacy for larger-scale WSN applications. This project aims at filling these gaps by providing: a deep analysis of the ZigBee Specification, mainly addressing the Network Layer and particularly its routing mechanisms; an identification of the ambiguities and open issues in the ZigBee protocol standard; the proposal of solutions to the previously referred problems; an implementation of a subset of the ZigBee Network Layer, namely the association procedure and tree routing, on our technological platform (MICAz motes, the TinyOS operating system, and the nesC programming language); and an experimental evaluation of that routing mechanism for WSNs.
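A minimal sketch of the ZigBee tree-routing rule that such an implementation revolves around: addresses are assigned hierarchically using the Cskip offset, and a router forwards a packet down to the child whose address block contains the destination, or up to its parent otherwise. The parameters Cm, Rm, Lm and the addresses below are illustrative, and this Python sketch is not the project's nesC code.

```python
# Sketch of ZigBee tree (hierarchical) routing based on Cskip address offsets.
# CM = max children, RM = max router children, LM = max depth. Values are illustrative.
CM, RM, LM = 6, 4, 3

def cskip(depth):
    """Address block size assigned to each router child of a router at `depth`."""
    if RM == 1:
        return 1 + CM * (LM - depth - 1)
    return int((1 + CM - RM - CM * RM ** (LM - depth - 1)) / (1 - RM))

def next_hop(my_addr, my_depth, parent_addr, dest):
    """Tree routing at a router (depth >= 1): down if dest is a descendant, else up."""
    block = cskip(my_depth)
    if my_addr < dest < my_addr + cskip(my_depth - 1):   # dest is one of our descendants
        if dest > my_addr + RM * block:                  # an end-device child: deliver directly
            return dest
        # Forward to the router child whose address block contains dest.
        return my_addr + 1 + ((dest - (my_addr + 1)) // block) * block
    return parent_addr                                   # not a descendant: route upwards

# Router with address 1 at depth 1 forwarding a packet to address 10:
print(next_hop(my_addr=1, my_depth=1, parent_addr=0, dest=10))   # -> 9 (its router child)
```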

Relevance:

30.00%

Publisher:

Abstract:

The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common characteristic is the need for drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).
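A minimal sketch of one commonly used radiography-based delamination metric, the delamination factor Fd = Dmax/Dnom (maximum delaminated diameter over nominal hole diameter), correlated against bearing-test strength. The paper compares several assessment methods, so this is only an illustrative option, and all numbers are made up.

```python
# Sketch: delamination factor Fd = Dmax / Dnom from radiographic measurements,
# correlated with bearing strength. All numbers are illustrative, not the paper's data.
import numpy as np

d_nominal = 6.0                                        # drilled hole diameter (mm)
d_max     = np.array([6.9, 7.4, 8.1, 8.6])             # max delaminated diameter per specimen (mm)
bearing   = np.array([420.0, 395.0, 360.0, 330.0])     # bearing strength per specimen (MPa)

fd = d_max / d_nominal                                 # delamination factor per specimen
r = np.corrcoef(fd, bearing)[0, 1]                     # Pearson correlation Fd vs. bearing strength
print(f"Fd = {np.round(fd, 2)}, correlation with bearing strength r = {r:.2f}")
```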

Relevance:

30.00%

Publisher:

Abstract:

Applications involving biosignals, such as Electrocardiography (ECG), are becoming more pervasive with the extension towards non-intrusive scenarios, targeting ambulatory healthcare monitoring, emotion assessment, and many other applications. In this study we introduce a new type of silver/silver chloride (Ag/AgCl) electrode based on a paper substrate and produced using an inkjet printing technique. This type of electrode can increase the potential applications of biosignal acquisition technologies for everyday life use, given the several advantages, such as cost reduction and easier recycling, resulting from the approach explored in our work. We performed a comparison study to assess the quality of this new electrode type, in which ECG data were collected with three types of Ag/AgCl electrodes: i) gelled; ii) dry; and iii) paper-based inkjet-printed. We also compared the performance of each electrode when data were acquired using a professional-grade gold-standard device and a low-cost platform. Experimental results showed that data acquired using our proposed inkjet-printed electrode are highly correlated with data obtained through conventional electrodes. Moreover, the electrodes are robust to high-end and low-end data acquisition devices.
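A minimal sketch of the kind of comparison reported: correlating an ECG trace acquired with the inkjet-printed electrodes against one acquired with conventional gelled electrodes. The signals below are synthetic stand-ins, and the sampling rate is an assumption.

```python
# Sketch: compare ECG traces from two electrode types via Pearson correlation.
# The signals are synthetic stand-ins for the paper's recordings; fs is assumed.
import numpy as np

fs = 1000.0                                  # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
reference = np.sin(2 * np.pi * 1.2 * t) * np.exp(-((t % (1 / 1.2)) * 8) ** 2)  # crude ECG-like pulses
gelled  = reference + 0.02 * np.random.default_rng(1).normal(size=t.size)      # conventional electrode
printed = reference + 0.05 * np.random.default_rng(2).normal(size=t.size)      # inkjet-printed electrode

r = np.corrcoef(gelled, printed)[0, 1]
print(f"Pearson correlation between electrode types: r = {r:.3f}")
```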

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To develop a model for evaluating the efficacy of the drug-dispensing service in primary health care. METHODS An efficacy criterion was adopted to determine the level of achievement of the service objectives. The evaluation model was developed on the basis of a literature search and discussions with experts. The applicability test of the model was conducted in 15 primary health care units in the city of Florianópolis, state of Santa Catarina, in 2010, and data were recorded in structured and pretested questionnaires. RESULTS The model developed was evaluated using five dimensions of analysis. The model was suitable for evaluating service efficacy and helped to identify the critical points of each service dimension. CONCLUSIONS Adaptations to the data collection technique may be required to adjust for the reality and needs of each situation. The evaluation of the drug-dispensing service should promote adequate access to medications supplied through the public health system.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To assess the validity of dengue fever reports and how it relates to the definition of case and severity. METHODS A diagnostic test assessment was conducted using cross-sectional sampling from a universe of 13,873 patients treated during the fifth epidemiological period in health institutions from 11 Colombian departments in 2013. The test under analysis was the reporting to the National Public Health Surveillance System, and the reference standard was the review of histories identified by active institutional search. We reviewed all histories of patients diagnosed with dengue fever, as well as a random sample of patients with febrile syndromes. The sensitivity and specificity of reports were estimated, using the inverse of the probability of being selected as the weight. The concordance between reporting and the findings of the active institutional search was calculated using the Kappa statistic. RESULTS We included 4,359 febrile patients, of whom 31.7% were classified as compatible with dengue fever (17 with severe dengue fever; 461 with dengue fever and warning signs; 904 with dengue fever and no warning signs). The global sensitivity of reports was 13.2% (95%CI 10.9;15.4) and the specificity was 98.4% (95%CI 97.9;98.9). Sensitivity varied according to severity: 12.1% (95%CI 9.3;14.8) for patients presenting dengue fever with no warning signs; 14.5% (95%CI 10.6;18.4) for those presenting dengue fever with warning signs; and 40.0% (95%CI 9.6;70.4) for those with severe dengue fever. Concordance between reporting and the findings of the active institutional search resulted in a Kappa of 10.1%. CONCLUSIONS Low concordance was observed between reporting and the review of clinical histories, which was associated with the low reporting of dengue fever compatible cases, especially milder ones.
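A minimal sketch of the validity measures named above: sensitivity and specificity of reporting against the active institutional search, with inverse-probability-of-selection weights, plus Cohen's kappa for concordance. The counts and weights below are illustrative, not the study's data.

```python
# Sketch: weighted sensitivity/specificity of reporting vs. the active institutional
# search (reference standard), plus Cohen's kappa. Counts and weights are illustrative.

def weighted_sens_spec(records):
    """records: tuples (reported: bool, case_by_reference: bool, weight = 1/P(selected))."""
    tp = sum(w for rep, ref, w in records if rep and ref)
    fn = sum(w for rep, ref, w in records if not rep and ref)
    tn = sum(w for rep, ref, w in records if not rep and not ref)
    fp = sum(w for rep, ref, w in records if rep and not ref)
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(records):
    n = sum(w for _, _, w in records)
    po = sum(w for rep, ref, w in records if rep == ref) / n          # observed agreement
    p_rep = sum(w for rep, _, w in records if rep) / n
    p_ref = sum(w for _, ref, w in records if ref) / n
    pe = p_rep * p_ref + (1 - p_rep) * (1 - p_ref)                    # chance agreement
    return (po - pe) / (1 - pe)

data = [(True, True, 1.0)] * 40 + [(False, True, 2.5)] * 260 + \
       [(False, False, 2.5)] * 900 + [(True, False, 1.0)] * 15
sens, spec = weighted_sens_spec(data)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}, kappa = {cohens_kappa(data):.2f}")
```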

Relevance:

30.00%

Publisher:

Abstract:

In practice, robotic manipulators present some degree of unwanted vibration. The advent of lightweight arm manipulators, mainly in the aerospace industry where weight is an important issue, leads to the problem of intense vibrations. On the other hand, robots interacting with the environment often generate impacts that propagate through the mechanical structure and also produce vibrations. In order to analyze these phenomena, a robot signal acquisition system was developed. The manipulator motion produces vibrations, either from the structural modes or from end-effector impacts. The instrumentation system acquires signals from several sensors that capture the joint positions, mass accelerations, forces and moments, and electrical currents in the motors. Afterwards, an analysis package, running off-line, reads the data recorded by the acquisition system and extracts the signal characteristics. Due to the multiplicity of sensors, the data obtained can be redundant, because the same type of information may be seen by two or more sensors. Given the price of the sensors, this aspect can be considered in order to reduce the cost of the system. On the other hand, the placement of the sensors is an important issue in order to obtain suitable signals of the vibration phenomenon. Moreover, the study of these issues can help in the design optimization of the acquisition system. In this line of thought, a sensor classification scheme is presented. Several authors have addressed the subject of sensor classification schemes. White (White, 1987) presents a flexible and comprehensive categorizing scheme that is useful for describing and comparing sensors. The author organizes the sensors according to several aspects: measurands, technological aspects, detection means, conversion phenomena, sensor materials, and fields of application. Michahelles and Schiele (Michahelles & Schiele, 2003) systematize the use of sensor technology. They identified several dimensions of sensing that represent the sensing goals for physical interaction. A conceptual framework is introduced that allows categorizing existing sensors and evaluating their utility in various applications. This framework not only guides application designers in choosing meaningful sensor subsets, but can also inspire new systems and lead to the evaluation of existing applications. Today's technology offers a wide variety of sensors. In order to use all the data from this diversity of sensors, a framework of integration is needed. Sensor fusion, fuzzy logic, and neural networks are often mentioned when dealing with the problem of combining information from several sensors to get a more general picture of a given situation. The study of data fusion has been receiving considerable attention (Esteban et al., 2005; Luo & Kay, 1990). A survey of the state of the art in sensor fusion for robotics can be found in (Hackett & Shah, 1990). Henderson and Shilcrat (Henderson & Shilcrat, 1984) introduced the concept of the logic sensor, which defines an abstract specification of the sensors to integrate in a multisensor system. The recent development of micro electro mechanical sensors (MEMS) with wireless communication capabilities allows sensor networks with interesting capabilities. This technology has been applied in several areas (Arampatzis & Manesis, 2005), including robotics. Cheekiralla and Engels (Cheekiralla & Engels, 2005) propose a classification of wireless sensor networks according to their functionalities and properties.
This paper presents the development of a sensor classification scheme based on the frequency spectrum of the signals and on statistical metrics. Bearing these ideas in mind, this paper is organized as follows. Section 2 briefly describes the robotic system enhanced with the instrumentation setup. Section 3 presents the experimental results. Finally, Section 4 draws the main conclusions and points out future work.
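A minimal sketch of the kind of feature extraction such a classification scheme relies on: the frequency spectrum of each sensor signal plus a few statistical metrics, which could then be compared to spot sensors carrying redundant information. The signals and the particular metrics chosen here are illustrative, not the paper's.

```python
# Sketch: per-sensor features from the frequency spectrum and simple statistical metrics,
# e.g. to compare sensors and detect redundancy. Signals and metric choices are illustrative.
import numpy as np

def sensor_features(signal, fs):
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]          # skip the DC bin
    return {
        "dominant_freq_hz": float(dominant),
        "rms": float(np.sqrt(np.mean(signal ** 2))),
        "std": float(np.std(signal)),
        "kurtosis_like": float(np.mean((signal - signal.mean()) ** 4) / np.var(signal) ** 2),
    }

fs = 2000.0
t = np.arange(0, 1, 1 / fs)
accel = np.sin(2 * np.pi * 35 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
force = np.sin(2 * np.pi * 35 * t + 0.4) + 0.1 * np.random.default_rng(1).normal(size=t.size)

print(sensor_features(accel, fs))
print(sensor_features(force, fs))   # a shared dominant frequency suggests redundant information
```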

Relevance:

30.00%

Publisher:

Abstract:

The widespread employment of carbon-epoxy laminates in high-responsibility and severely loaded applications introduces an issue regarding their handling after damage. Repair of these structures should be evaluated, instead of their disposal, for cost-saving and ecological purposes. Under this perspective, the availability of efficient repair methods is essential to restore the strength of the structure. The development and validation of accurate predictive tools for the behaviour of repairs are also extremely important, allowing the reduction of costs and time associated with extensive test programmes. Compared with strap repairs, scarf repairs have the advantages of higher efficiency and the absence of aerodynamic disturbance. This work reports on a numerical study of the tensile behaviour of three-dimensional scarf repairs in carbon-epoxy structures, using a ductile adhesive (Araldite® 2015). The finite element analysis was performed in ABAQUS® and Cohesive Zone Modelling was used for the simulation of damage onset and growth in the adhesive layer. Trapezoidal cohesive laws in each pure mode were used to account for the ductility of the specific adhesive. A parametric study was performed on the repair width and scarf angle. The use of over-laminating plies covering the repaired region on the outer or both repair surfaces was also tested in an attempt to increase the repair efficiency. The obtained results allowed the proposal of design principles for repairing composite structures.
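A minimal sketch of a trapezoidal traction-separation (cohesive) law of the type used for the ductile adhesive layer: a linear rise to the cohesive strength, a stress plateau, then linear softening to failure. The parameter values are illustrative, not calibrated Araldite 2015 properties. The area under such a curve corresponds to the fracture toughness in that mode.

```python
# Sketch: trapezoidal traction-separation law for a ductile adhesive layer in one pure mode.
# Parameter values are illustrative, not calibrated Araldite 2015 properties.
def trapezoidal_traction(delta, t0, d1, d2, df):
    """Traction at separation `delta`: rise to t0 at d1, plateau to d2, soften to zero at df."""
    if delta <= 0.0:
        return 0.0
    if delta < d1:                     # elastic branch
        return t0 * delta / d1
    if delta < d2:                     # plateau accounting for adhesive ductility
        return t0
    if delta < df:                     # linear softening
        return t0 * (df - delta) / (df - d2)
    return 0.0                         # complete failure

# Illustrative parameters: strength 20 MPa, plateau between 0.01 and 0.05 mm, failure at 0.1 mm.
for d in (0.005, 0.03, 0.08, 0.12):
    print(d, trapezoidal_traction(d, t0=20.0, d1=0.01, d2=0.05, df=0.1))
```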

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To evaluate the validity and reliability of an instrument that evaluates the structure of primary health care units for the treatment of tuberculosis. METHODS This cross-sectional study used simple random sampling and evaluated 1,037 health care professionals from five Brazilian municipalities (Natal, state of Rio Grande do Norte; Cabedelo, state of Paraíba; Foz do Iguaçu, state of Paraná; São José do Rio Preto, state of São Paulo; and Uberaba, state of Minas Gerais) in 2011. Structural indicators were identified and validated, considering different methods of organization of the health care system in municipalities of different population sizes. Each structure represented the organization of health care services and contained the resources available for the execution of those services: physical resources (equipment, consumables, and facilities); human resources (number and qualification); and resources for maintenance of the existing infrastructure and technology (deemed as the organization of health care services). The statistical analyses used in the validation process included reliability analysis, exploratory factor analysis, and confirmatory factor analysis. RESULTS The validation process indicated the retention of five factors, with 85.9% of the total variance explained, internal consistency between 0.6460 and 0.7802, and a quality of fit of the confirmatory factor analysis of 0.995 using the goodness-of-fit index. The retained factors comprised five structural indicators: professionals involved in the care of tuberculosis patients, training, access to recording instruments, availability of supplies, and coordination of health care services with other levels of care. Availability of supplies had the best performance and the lowest coefficient of variation among the services evaluated. The indicators for the assessment of human resources and coordination with other levels of care had satisfactory performance, but the latter showed the highest coefficient of variation. The performance of the indicators "training" and "access to recording instruments" was inferior to that of the other indicators. CONCLUSIONS The instrument showed feasibility of application and potential to assess the structure of primary health care units for the treatment of tuberculosis.
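A minimal sketch of one common way internal consistency figures like those quoted above (0.6460 to 0.7802) are computed, Cronbach's alpha over the items of a structural indicator. The item responses below are made up, and treating the reported consistency as Cronbach's alpha is an assumption rather than something stated in the abstract.

```python
# Sketch: Cronbach's alpha for the items of one structural indicator. Responses are made up;
# assuming the reported internal consistency is Cronbach's alpha.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = np.array([   # 6 respondents x 4 items, Likert-style scores (illustrative)
    [4, 4, 5, 3],
    [3, 3, 4, 3],
    [5, 4, 5, 4],
    [2, 3, 2, 2],
    [4, 5, 4, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```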

Relevance:

30.00%

Publisher:

Abstract:

Laccases are multi-copper oxidases that oxidise a wide range of substrates, including phenol and aniline derivatives, which can be further involved in coupling reactions leading to the formation of dimeric and trimeric structures. This paper describes the enzyme-mediated dimerisation of several ortho and meta, para-disubstituted aromatic amines into phenazine ("head-to-tail" dimers) and phenoxazinone chromophores. The redox properties of the substituted aromatic amines were studied by cyclic voltammetry, and the kinetic constants of CotA and Trametes versicolor laccases were measured for selected aromatic amines. The structure of the novel phenazine and phenoxazinone dyes enzymatically synthesised using CotA laccase was assessed by NMR and MS. Overall, our data show that this enzymatic green process is an efficient alternative to the classic chemical oxidation of aromatic amines and phenols, with an impact on the broad field of applications of these heterocyclic compounds.
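A minimal sketch of how enzyme kinetic constants could be extracted from initial-rate data by fitting the Michaelis-Menten equation; the substrate concentrations and rates are made up, and Michaelis-Menten behaviour for these laccase/amine pairs is an assumption, not something stated in the abstract.

```python
# Sketch: fit Michaelis-Menten kinetics v = Vmax * S / (Km + S) to initial-rate data
# for one aromatic amine substrate. Data points are made up; MM behaviour is assumed.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

substrate = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0])    # mM, illustrative
rate = np.array([0.9, 1.6, 3.0, 4.2, 5.3, 6.0, 6.5])           # arbitrary rate units, illustrative

(vmax, km), _ = curve_fit(michaelis_menten, substrate, rate, p0=(7.0, 0.5))
print(f"Vmax = {vmax:.2f}, Km = {km:.2f} mM")
```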

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To analyze whether size, administrative level, legal status, type of unit, and educational activity influence hospital network performance in providing services to the Brazilian Unified Health System. METHODS This cross-sectional study evaluated data from the Hospital Information System and the Cadastro Nacional de Estabelecimentos de Saúde (National Registry of Health Facilities), 2012, in São Paulo, Southeastern Brazil. We calculated performance indicators, such as the ratio of hospital employees per bed, the mean amount paid per admission, the bed occupancy rate, the average length of stay, the bed turnover index, and the hospital mortality rate. Data were expressed as mean and standard deviation. The groups were compared using analysis of variance (ANOVA) with Bonferroni correction. RESULTS The hospital occupancy rate in small hospitals was lower than in medium, large, and special-sized hospitals. A higher hospital occupancy rate and bed turnover index were observed in hospitals that include education in their activities. The hospital mortality rate was lower in specialized hospitals than in general ones, despite their higher proportion of highly complex admissions. We found no differences between hospitals under direct and indirect administration for most of the indicators analyzed. CONCLUSIONS The study indicated the importance of the scale effect on efficiency, with larger hospitals showing higher performance. Hospitals that include education in their activities had higher operating performance, albeit with the associated importance of using human resources and highly complex structures. Specialized hospitals had a significantly lower mortality rate than general hospitals, indicating the positive effect of the volume of procedures and the technology used on clinical outcomes. The analysis related to administrative level and legal status did not show any significant performance differences between the categories of public hospitals.
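A minimal sketch of how the performance indicators named above are commonly computed from aggregate admission data (employees per bed, mean amount paid per admission, occupancy rate, average length of stay, bed turnover index, hospital mortality rate). The figures are made up, and the formulas are the standard ones rather than necessarily the study's exact operational definitions.

```python
# Sketch: standard formulas for the hospital performance indicators mentioned above.
# Figures are made up; the study's exact operational definitions may differ.
def hospital_indicators(beds, employees, patient_days, admissions, deaths, amount_paid, days=365):
    return {
        "employees_per_bed": employees / beds,
        "mean_amount_paid_per_admission": amount_paid / admissions,
        "occupancy_rate_pct": 100.0 * patient_days / (beds * days),
        "average_length_of_stay_days": patient_days / admissions,
        "bed_turnover_index": admissions / beds,
        "hospital_mortality_rate_pct": 100.0 * deaths / admissions,
    }

small_hospital = hospital_indicators(beds=40, employees=180, patient_days=9500,
                                     admissions=2100, deaths=55, amount_paid=3_150_000.0)
for name, value in small_hospital.items():
    print(f"{name}: {value:.1f}")
```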