897 results for Tool development
Abstract:
By the end of 2004, the Canadian swine population had experienced a severe increase in the incidence of Porcine circovirus-associated disease (PCVAD), a problem that was associated with the emergence of a new Porcine circovirus-2 genotype (PCV-2b), previously unrecovered in North America. Thus, it became important to develop a diagnostic tool that could differentiate between the old and new circulating genotypes (PCV-2a and -2b, respectively). Consequently, a multiplex real-time quantitative polymerase chain reaction (mrtqPCR) assay that could sensitively and specifically identify and differentiate PCV-2 genotypes was developed. A retrospective epidemiological survey that used the mrtqPCR assay was performed to determine whether cofactors could affect the risk of PCVAD. Of the 121 PCV-2–positive cases gathered for this study, 4.13%, 92.56%, and 3.31% were positive for PCV-2a, PCV-2b, and both genotypes, respectively. In a data analysis using univariate logistic regressions, the PCVAD-compatible (PCVAD/c) score was significantly associated with the presence of Porcine reproductive and respiratory syndrome virus (PRRSV), PRRSV viral load, PCV-2 viral load, and PCV-2 immunohistochemistry (IHC) results. Polytomous logistic regression analysis revealed that the PCVAD/c score was affected by PCV-2 viral load (P = 0.0161) and IHC (P = 0.0128), but not by the PRRSV variables (P > 0.9), suggesting that mrtqPCR in tissue is a reliable alternative to IHC. Logistic regression analyses revealed that PCV-2 increased the odds of isolating two major swine pathogens of the respiratory tract, Actinobacillus pleuropneumoniae and Streptococcus suis serotypes 1/2, 1, 2, 3, 4, and 7, which are serotypes commonly associated with clinical disease.
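As a rough illustration of the univariate analyses described above, the sketch below fits a logistic regression with Python's statsmodels and reports an odds ratio; the case-level data and variable names are fabricated for illustration and are not the study's data.

```python
# Minimal sketch of a univariate logistic regression relating a binary
# outcome (e.g., pathogen isolation) to PCV-2 viral load.
# All data values are purely illustrative, not from the paper.
import numpy as np
import statsmodels.api as sm

# Hypothetical per-case data: log10 PCV-2 viral load and isolation outcome.
log_viral_load = np.array([3.1, 4.2, 5.0, 5.8, 6.3, 6.9, 7.4, 8.0])
isolated = np.array([0, 0, 0, 1, 0, 1, 1, 1])

X = sm.add_constant(log_viral_load)        # intercept + predictor
model = sm.Logit(isolated, X).fit(disp=0)  # univariate logistic regression

beta = model.params[1]
print(f"odds ratio per log10 increase in viral load: {np.exp(beta):.2f}")
print(f"P-value: {model.pvalues[1]:.4f}")
```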
Abstract:
During the 1990s, the Wavelet Transform emerged as an important signal processing tool with potential applications in time-frequency analysis and non-stationary signal processing. Wavelets have gained popularity in a broad range of disciplines, such as signal/image compression, medical diagnostics, boundary value problems, geophysical signal processing, statistical signal processing, pattern recognition, and underwater acoustics. In 1993, G. Evangelista introduced the Pitch-Synchronous Wavelet Transform, which is particularly suited for pseudo-periodic signal processing. The work presented in this thesis concentrates on two interrelated topics in signal processing: Wavelet Transform based signal compression and the computation of the Discrete Wavelet Transform. A new compression scheme is described in which the Pitch-Synchronous Wavelet Transform technique is combined with the popular Linear Predictive Coding method for pseudo-periodic signal processing. Subsequently, a novel Parallel Multiple Subsequence structure is presented for the efficient computation of the Wavelet Transform. Case studies are also presented to highlight potential applications.
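As a minimal, concrete illustration of the Discrete Wavelet Transform computation that the thesis addresses, the following Python sketch uses the PyWavelets package on a synthetic pseudo-periodic signal; the wavelet choice and decomposition depth are arbitrary assumptions, and this is the standard DWT building block rather than Evangelista's pitch-synchronous variant.

```python
# Three-level DWT of a synthetic pseudo-periodic signal with PyWavelets.
import numpy as np
import pywt

# Pseudo-periodic test signal: a decaying sinusoid with a pitch-like period.
t = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 8 * t) * np.exp(-2 * t)

# Multilevel decomposition with the Daubechies-4 wavelet (assumed choice).
coeffs = pywt.wavedec(signal, 'db4', level=3)
approx, details = coeffs[0], coeffs[1:]
print("approximation length:", len(approx))
print("detail lengths:", [len(d) for d in details])

# The transform is invertible; waverec may pad by one sample, so trim.
reconstructed = pywt.waverec(coeffs, 'db4')[:len(signal)]
print("max reconstruction error:", np.max(np.abs(signal - reconstructed)))
```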
Abstract:
This thesis describes the development and analysis of an Isosceles Trapezoidal Dielectric Resonator Antenna (ITDRA), realizing different DR orientations with suitable feed configurations so that it can serve multiband, dual-band dual-polarized, and wideband applications. The work is motivated by the need for compact, highly efficient, low-cost antennas suitable for multiband, dual-band dual-polarized, and broadband operation, compatible with MICs, and able to support less expensive, more efficient, high-quality wireless communication systems. To satisfy these demands, a novel-shaped Dielectric Resonator (DR) is fabricated and investigated for the required properties by trying out different orientations of the DR on a simple microstrip feed, as well as with a slotted ground plane. The thesis first reviews recent and past developments in the microwave industry on this topic through a concise survey of the literature. The theoretical aspects of DRAs and different feeding techniques are then described, followed by the fabrication and characterization of the DRA. To achieve the requirements above, both simulations and experimental measurements were undertaken. A 3-D finite element method (FEM) electromagnetic simulation tool, HFSS™ by Agilent, was used to determine the optimum geometry of the dielectric resonator; it proved useful for producing approximate results, although it had some limitations. A numerical technique, the finite difference time domain (FDTD) method, was used to validate the wideband design, with MATLAB used for modeling the ITDRA and implementing the FDTD analysis. In conclusion, this work offers a new, efficient, and relatively simple alternative for antennas serving multiple requirements in wireless communication systems.
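For readers unfamiliar with FDTD, a minimal one-dimensional update loop is sketched below in Python; it is a free-space toy, not the thesis's 3-D MATLAB implementation, and the grid size, time steps, and source parameters are all assumptions.

```python
# Compact 1-D FDTD update loop on a staggered Yee grid, normalized units.
import numpy as np

nx, nt = 200, 400
ez = np.zeros(nx)       # electric field samples
hy = np.zeros(nx - 1)   # magnetic field samples (offset by half a cell)

for n in range(nt):
    # Leapfrog updates: H from the spatial difference of E, then E from H
    # (Courant number = 1 in 1-D free space, so the coefficients are 1).
    hy += np.diff(ez)
    ez[1:-1] += np.diff(hy)
    # Soft Gaussian source injected at the grid centre.
    ez[nx // 2] += np.exp(-0.5 * ((n - 40) / 10.0) ** 2)

print("peak |Ez| after propagation:", np.abs(ez).max())
```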
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks, and this specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This motivates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals; doing so can significantly improve software quality and remains a challenging field. This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thereby improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used to detect redundant code; instances of code redundancy are identified based on the rules stipulated for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool should be very useful in steering novices toward correct use of difficult microcontroller features when developing embedded systems.
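A hedged sketch of the redundant bank-switch detection idea follows: track the active memory bank along a straight-line code path and flag bank-select instructions that re-select the current bank. The mnemonics, single-operand bank model, and branch-free path are simplified placeholders, not the dissertation's actual rule set for the PIC16F87X's RP0/RP1 scheme.

```python
# Toy redundant-bank-switch detector over a linear instruction sequence.
def find_redundant_bank_switches(instructions):
    """Return indices of bank-select instructions that do not change
    the currently active bank (i.e., redundant switches)."""
    active_bank = None
    redundant = []
    for i, (op, arg) in enumerate(instructions):
        if op == "BANKSEL":
            if arg == active_bank:
                redundant.append(i)  # re-selecting the already-active bank
            active_bank = arg
    return redundant

# Illustrative straight-line path as (opcode, operand) pairs.
path = [
    ("BANKSEL", 1), ("MOVWF", "TRISB"),
    ("BANKSEL", 1),                      # redundant: bank 1 already active
    ("MOVWF", "OPTION_REG"),
    ("BANKSEL", 0), ("MOVWF", "PORTB"),
]
print("redundant bank switches at indices:", find_redundant_bank_switches(path))
```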
Abstract:
The basic concepts of digital signal processing are taught to students in engineering and science, with the focus of such courses on linear, time-invariant systems. The question of what happens when the system is governed by a quadratic or cubic equation remains unanswered in the vast majority of the signal processing literature. Light was shed on this problem when John V. Mathews and Giovanni L. Sicuranza published the book Polynomial Signal Processing, which opened up an unseen vista of polynomial systems for signal and image processing. The book presents the theory and implementation of both adaptive and non-adaptive FIR and IIR quadratic systems, which offer improved performance over conventional linear systems. The theory of quadratic systems is a pristine area of research that involves computationally intensive work. Once the area of research was selected, the next issue was the choice of software tool to carry out the work. Conventional languages like C and C++ were eliminated early, as they are not interpreted and lack good-quality plotting libraries. MATLAB proved to be very slow, as did SCILAB and Octave. The search for a scientific computing language that was as fast as C, but with a good-quality plotting library, ended with Python, a distant relative of LISP, which proved ideal for scientific computing. An account of the use of Python, its scientific computing package scipy, and the plotting library pylab is given in the appendix. Initially, the work focused on designing predictors that exploit the polynomial nonlinearities inherent in speech generation mechanisms. The work then moved into medical image processing, which offered more potential for quadratic methods; the major focus in this area is on quadratic edge detection methods for retinal images and fingerprints, as well as de-noising raw MRI signals.
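To make the notion of a quadratic system concrete, here is a minimal second-order (Volterra) FIR filter in Python/NumPy, matching the thesis's stated tool chain; the kernel values are illustrative and are not taken from the work.

```python
# Second-order Volterra FIR filter: linear kernel h1 plus quadratic kernel h2.
import numpy as np

def quadratic_fir(x, h1, h2):
    """y[n] = sum_i h1[i] x[n-i] + sum_{i,j} h2[i,j] x[n-i] x[n-j]."""
    N = len(h1)
    y = np.zeros(len(x))
    for n in range(len(x)):
        # Most recent N samples, zero-padded at the start of the signal.
        window = np.array([x[n - i] if n - i >= 0 else 0.0 for i in range(N)])
        y[n] = h1 @ window + window @ h2 @ window
    return y

x = np.sin(2 * np.pi * 0.05 * np.arange(64))  # test sinusoid
h1 = np.array([0.5, 0.25, 0.125])             # linear kernel (assumed values)
h2 = 0.1 * np.eye(3)                          # quadratic kernel (assumed values)
y = quadratic_fir(x, h1, h2)
print("output energy:", np.sum(y ** 2))
```

The quadratic term makes the output contain harmonics a linear, time-invariant filter could never produce, which is exactly what such polynomial systems exploit.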
Abstract:
Solid waste management is now an important environmental issue in countries like India. Statistics show a substantial increase in solid waste generation, especially in urban areas. This trend can be ascribed to rapid population growth, changing lifestyles, food habits, and living standards, lack of financial resources, institutional weaknesses, improper choice of technology, and public apathy toward municipal solid waste. Waste is directly related to the consumption of resources and to dumping on land. Ecological footprint analysis, an impact assessment and environmental management tool, relates these two factors by expressing the amount of land required to dispose of per-capita generated waste. It is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms; by quantifying the ecological footprint, we can formulate strategies to reduce the footprint and thereby achieve sustainable living. This paper explores ecological footprint analysis with special emphasis on waste generation, discusses and analyses the waste footprint of Kochi city, India, and suggests strategies to reduce the waste footprint, thereby making the city more sustainable, greener, and cleaner.
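As a toy numerical illustration of the waste-footprint relation described above (land required to assimilate per-capita generated waste), the sketch below uses made-up figures; the population, per-capita generation rate, and assimilation factor are assumptions, not the paper's data for Kochi.

```python
# Back-of-the-envelope waste footprint: land area needed to absorb a city's
# annual solid waste. All figures are invented placeholders.
population = 600_000                  # hypothetical city population
waste_per_capita = 0.5 * 365          # kg/person/year (0.5 kg/day, assumed)
assimilation_rate = 50_000            # kg one hectare can absorb per year (assumed)

total_waste = population * waste_per_capita           # kg/year
waste_footprint_ha = total_waste / assimilation_rate  # hectares required

print(f"total waste: {total_waste / 1e6:.1f} kt/year")
print(f"waste footprint: {waste_footprint_ha:,.0f} ha")
```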
Abstract:
Kochi, the commercial capital of Kerala and the most important city on the western coast of India after Mumbai, contains a wide variety of residential environments. The present pattern of the city can be described as haphazard growth, with problems typical of unplanned urban development. This trend can be ascribed to rapid population growth, changing lifestyles, food habits, and living standards, institutional weaknesses, improper choice of technology, and public apathy. Ecological footprint analysis (EFA) is a quantitative tool that represents the ecological load imposed on the earth by humans in spatial terms. This paper analyses the scope of EFA as a sustainable environmental management tool for Kochi city.
Abstract:
TOSCANA is a graphical tool that supports the human-centered interactive processes of conceptual knowledge processing. The generality of the approach makes TOSCANA a universal tool applicable to a variety of domains; only the so-called conceptual scales have to be designed for new applications. The presentation shows how the use of abstract scales allows the reuse of previously defined conceptual scales. Furthermore, it describes how thesauri and conceptual taxonomies can be integrated into the generation of conceptual scales.
Abstract:
Conceptual Information Systems are based on a formalization of the concept of "concept" as it is discussed in traditional philosophical logic. This formalization supports a human-centered approach to the development of Information Systems. We discuss this approach by means of an implemented Conceptual Information System for supporting IT security management in companies and organizations.
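The formalization referred to here is commonly Formal Concept Analysis, in which a "concept" is a pair of an object set (extent) and an attribute set (intent) that are closed under each other's derivation operators. A brute-force Python sketch over a toy IT-security context follows; the objects and attributes are invented for illustration only.

```python
# Enumerate all formal concepts of a tiny object-attribute context.
from itertools import combinations

objects = {
    "firewall": {"network", "preventive"},
    "ids":      {"network", "detective"},
    "backup":   {"host", "corrective"},
}

def common_attributes(objs):
    sets = [objects[o] for o in objs]
    # Empty object set derives to the full attribute set.
    return set.intersection(*sets) if sets else {a for s in objects.values() for a in s}

def matching_objects(attrs):
    return {o for o, s in objects.items() if attrs <= s}

# A concept is an object set equal to the extent of its own intent (closure).
concepts = set()
names = list(objects)
for r in range(len(names) + 1):
    for objs in combinations(names, r):
        intent = common_attributes(objs)
        extent = frozenset(matching_objects(intent))
        concepts.add((extent, frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(set(extent) or "{}", "<->", set(intent) or "{}")
```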
Abstract:
Summary: Productivity and forage quality of legume-grass swards are important factors for successful arable farming in both organic and conventional farming systems. For these objectives the botanical composition of the swards is of particular importance, especially the content of legumes, due to their ability to fix airborne nitrogen. As it can vary considerably within a field, a non-destructive detection method usable while doing other tasks would facilitate more targeted sward management and could predict the nitrogen supply of the soil for the subsequent crop. This study was undertaken to explore the potential of digital image analysis (DIA) for a non-destructive prediction of the legume dry matter (DM) contribution of legume-grass mixtures. For this purpose an experiment was conducted in a greenhouse, comprising a sample of 64 experimental swards: pure swards of red clover (Trifolium pratense L.), white clover (Trifolium repens L.), and lucerne (Medicago sativa L.), as well as binary mixtures of each legume with perennial ryegrass (Lolium perenne L.). Growth stages ranged from tillering to heading, and the proportion of legumes from 0 to 80%. Based on digital sward images, three steps were taken to estimate the legume contribution (% of DM): i) development of a DIA procedure to estimate legume coverage (% of area); ii) description of the relationship between legume coverage (% of area), derived from digital analysis of legume coverage relative to the green area in a digital image, and legume contribution (% of DM); iii) estimation of the legume DM contribution from the findings of i) and ii). i) Different tools were tested to find the most suitable approach for estimating legume coverage by means of DIA. Morphological operators such as erode and dilate support the differentiation of objects of different shape by shrinking and dilating objects (Soille, 1999); when applied to digital images of legume-grass mixtures, thin grass leaves were removed whereas rounder clover leaves were retained. Legume leaves were then identified by threshold segmentation. Segmentation of greyscale images proved inapplicable, since the separation of legumes from bare soil failed. The advanced procedure, comprising morphological operators and HSL colour information, could determine bare soil areas in young and open swards very accurately, and legume-specific HSL thresholds allowed precise estimation of legume coverage across a wide range, from 11.8 to 72.4%. Based on this legume-specific DIA procedure, estimated legume coverage showed good correlations with the measured values across the whole range of sward ages (R² 0.96, SE 4.7%). A wide range of form parameters (i.e., size, breadth, rectangularity, and circularity of areas) was tested across all sward types, but none significantly improved the prediction accuracy of legume coverage. ii) Using measured reference data of legume coverage and contribution, a first approach found a common relationship based on all three legumes and sward ages of 35, 49, and 63 days, with R² 0.90. This relationship was improved by a legume-specific approach using only 49- and 63-day-old swards (R² 0.94, 0.96, and 0.97 for red clover, white clover, and lucerne, respectively), since differing structural attributes of the legume species influence the relationship between these two parameters.
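A hedged sketch of step i), morphological filtering plus HSL thresholding, is given below using OpenCV; the synthetic input image, threshold window, and kernel size are placeholders, not the study's calibrated legume-specific values.

```python
# Morphological opening plus HSL thresholding to estimate legume coverage.
import cv2
import numpy as np

# Synthetic stand-in for a sward photograph; in practice this would be
# cv2.imread("sward.png") on a real image.
rng = np.random.default_rng(0)
image = rng.integers(0, 255, size=(240, 320, 3), dtype=np.uint8)

hls = cv2.cvtColor(image, cv2.COLOR_BGR2HLS)

# Hypothetical legume-specific window over (hue, lightness, saturation).
mask = cv2.inRange(hls, (35, 40, 60), (85, 200, 255))

# Erosion removes thin grass blades; dilation restores rounder legume leaves.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
mask = cv2.erode(mask, kernel)
mask = cv2.dilate(mask, kernel)

legume_coverage = 100.0 * np.count_nonzero(mask) / mask.size
print(f"estimated legume coverage: {legume_coverage:.1f} % of area")
```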
In a second approach, biomass was included in the model to account for the different structures of swards of different ages. The resulting model gives a closer look at the relationship between legume coverage in binary legume-ryegrass communities and legume contribution: at the same level of legume coverage, legume contribution decreased as total biomass increased. This phenomenon may be caused by more non-leguminous biomass being covered by legume leaves at high levels of total biomass. Additionally, values of legume contribution and coverage were transformed to the logit scale to avoid problems with heteroscedasticity and negative predictions. The resulting relationships between measured and calculated legume contribution indicated high model accuracy for all legume species (R² 0.93, 0.97, 0.98 with SE 4.81, 3.22, 3.07% of DM for red clover, white clover, and lucerne swards, respectively). Validation of the model using digital images collected over field-grown swards, with biomass ranges within the scope of the model, shows that the model can predict legume contribution for the most common legume-grass swards (Frame, 1992; Ledgard and Steele, 1992; Loges, 1998). iii) An advanced procedure for determining legume DM contribution by DIA is suggested, which includes morphological operators and HSL colour information in the image analysis and applies an advanced function to predict legume DM contribution from legume coverage while considering total sward biomass. Low residuals between measured and calculated values of legume dry matter contribution were found for the separate legume species (R² 0.90, 0.94, 0.93 with SE 5.89, 4.31, 5.52% of DM for red clover, white clover, and lucerne swards, respectively). The introduced DIA procedure provides a rapid and precise estimation of legume DM contribution for different legume species across a wide range of sward ages. Further research is needed to adapt the procedure to field scale, dealing with differing light effects and potentially taller swards. The integration of total biomass into the model does not necessarily reduce its practical applicability, as a combined estimation of total biomass by field spectroscopy (Biewer et al. 2009) and of legume coverage by DIA may allow an accurate prediction of the legume contribution in legume-grass mixtures.
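A sketch of the second modelling step, a logit-scale regression of legume contribution on coverage and total biomass, might look as follows; the data points, units, and coefficients are fabricated stand-ins for the measured reference values.

```python
# Logit-transformed regression of legume DM contribution on coverage and biomass.
import numpy as np
import statsmodels.api as sm

def logit(p):
    p = np.clip(p, 1e-3, 1 - 1e-3)  # guard against 0/1 before the transform
    return np.log(p / (1 - p))

coverage = np.array([0.15, 0.30, 0.45, 0.55, 0.70])      # legume coverage, fraction
biomass = np.array([1.2, 1.8, 2.5, 3.1, 3.9])            # total DM, t/ha (assumed units)
contribution = np.array([0.10, 0.22, 0.33, 0.40, 0.52])  # legume share of DM

X = sm.add_constant(np.column_stack([logit(coverage), biomass]))
fit = sm.OLS(logit(contribution), X).fit()

# Back-transform a prediction to the original %-of-DM scale.
new = sm.add_constant(np.column_stack([logit([0.5]), [2.0]]), has_constant='add')
pred = 1 / (1 + np.exp(-fit.predict(new)))
print(f"predicted legume contribution: {100 * pred[0]:.1f} % of DM")
```

Working on the logit scale keeps back-transformed predictions inside 0-100%, which is the property the abstract highlights.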
Abstract:
- Status report on June Executive Board commitments
- Enterprise-level LESAT Beta Version
- Detailed-level LESAT Development Plan
- Industry and government participation and support requirements
- Resource needs
- Executive Board decision on proposed next steps
Abstract:
The ISSA Pedagogical Standards were first published in 2001 as a network-developed tool that defined quality in teaching practices and the classroom environment and captured the changes that had occurred in the region since 1994, when the Step by Step Program, an initiative to promote democratic principles in early childhood development and education, was launched. The Program was built on the belief that each child has the right to receive maximum support for the development of his or her full potential, and that this work should be done in partnership and close cooperation with families, communities, and professionals.
Abstract:
The pedagogical and didactic dynamic system focuses on the individual learning process and aims at the development of artistic knowledge, helping and guiding learners through different strategies and individual support, thus reinforcing the process. Consequently, this presentation seeks an alternative to student-teacher intercommunication grounded in the educational paradigm, through textual analysis of the daily diaries kept by teacher and students, so as to uncover successes and difficulties.
Abstract:
The activated sludge system is the most widely used biological treatment for wastewater purification worldwide. Its performance depends on the correct operation of both the biological reactor and the secondary settler. When the sedimentation phase does not proceed correctly, non-settled biomass escapes with the effluent, impacting the receiving environment. Solids separation problems are currently one of the main causes of inefficiency in the operation of activated sludge systems worldwide. They include filamentous bulking, viscous bulking, biological foaming, dispersed growth, pin-point floc, and uncontrolled denitrification. The origin of separation problems generally lies in an imbalance between the main communities of microorganisms involved in biomass sedimentation: floc-forming bacteria and filamentous bacteria. Because of this microbiological origin, identifying and controlling these problems is not an easy task for plant managers. Knowledge-Based Decision Support Systems (KBDSS) are a group of software tools characterized by their ability to represent heuristic knowledge and handle large amounts of data. The objective of this thesis is the development and validation of a KBDSS specifically designed to support plant managers in controlling solids separation problems of microbiological origin in activated sludge systems. To achieve this main objective, the KBDSS must have the following characteristics: (1) the implementation of the system must be feasible and realistic to guarantee its correct operation; (2) the system's reasoning must be dynamic and evolving, adapting to the needs of the target domain; and (3) the system's reasoning must be intelligent. First, to guarantee the feasibility of the system, a small-scale study (Catalonia) was carried out to determine the variables most commonly used for diagnosing and monitoring the problems, the most feasible control methods, and the main limitations the system would have to overcome. Results of previous applications have shown that the main limitation in developing KBDSSs is the structure of the knowledge base (KB), where all the acquired domain knowledge is represented together with the reasoning processes to follow. In our case, given the dynamics of the domain, these limitations could grow if the design were not optimal. To this end, the Domino Model was proposed as a tool for the conceptual design of the system. Finally, in line with the last objective of intelligent reasoning, an Expert System (based on expert knowledge) and a Case-Based Reasoning System (based on experience) were integrated as the main intelligent systems in charge of carrying out the reasoning of the KBDSS. Chapters 5 and 6 present the development of the dynamic Expert System (ES) and of the temporal Case-Based Reasoning System, called the Episode-Based Reasoning System (EBRS), respectively. Chapter 7 then presents details of the implementation of the overall system (KBDSS) in the G2 environment.
Chapter 8 presents the results obtained during the 11 months of system validation, in which aspects such as the accuracy, capability, and usefulness of the system were validated both experimentally (prior to implementation) and through its real deployment at the Girona wastewater treatment plant (EDAR de Girona). Finally, Chapter 9 lists the main conclusions derived from this thesis.
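As a purely illustrative fragment of the kind of expert-system reasoning such a KBDSS encodes, the Python sketch below maps a few plant observations to a solids separation diagnosis; the thresholds, variables, and rules are invented placeholders, not the knowledge base developed in the thesis.

```python
# Toy rule-based fragment: classify a solids separation problem from
# three hypothetical observations.
def diagnose(svi, filament_index, surface_foam):
    """Return a tentative solids-separation diagnosis.

    svi            -- sludge volume index, mL/g
    filament_index -- subjective filament abundance, 0 (none) to 5 (abundant)
    surface_foam   -- True if a stable brown foam covers the clarifier
    """
    if surface_foam:
        return "biological foaming"
    if svi > 150 and filament_index >= 4:
        return "filamentous bulking"      # poor settling driven by filaments
    if svi > 150 and filament_index <= 2:
        return "viscous (zoogleal) bulking"
    return "no solids separation problem detected"

print(diagnose(svi=210, filament_index=5, surface_foam=False))
# -> filamentous bulking
```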
Abstract:
This paper reviews a study to examine the feasibility of using elicited language samples as a basis for planning language instruction and as a measure of progress in language development.