973 results for Vector spaces -- Problems, exercises, etc.


Relevance:

20.00%

Publisher:

Abstract:

This study reports the embryogenesis of T. infestans (Hemiptera, Reduviidae). Morphological parameters of growth sequences from oviposition until hatching (12-14 days at 28ºC) were established. Five periods, expressed as a percentage of the time of development (TD), were characterized from oviposition until hatching. The most important morphological features were: 1) formation of the blastoderm within 7% of TD; 2) germ band and gastrulation within 30% of TD; 3) nerve cord, limb budding, thoracic and abdominal segmentation, and formation of the body cavity within 50% of TD; 4) completion of the nervous system and blastokinesis, and development of the embryonic cuticle within 65% of TD; 5) differentiation of the mouth parts, fat body, and Malpighian tubules during the final stage, with completion of the embryo at day 12 to day 14, around hatching. These signals were chosen as appropriate morphological parameters that should enable the evaluation of embryologic modifications due to the action(s) of different insecticides.

Relevance:

20.00%

Publisher:

Abstract:

The lethal effect of a bait containing an aqueous hexachlorocyclohexane (HCH) suspension at a concentration of 1 g/l and maintained at room temperature was studied in the laboratory over a period of 12 weeks. The suspension was placed in a latex bag hanging inside a 1000-ml beaker tightly covered with nylon netting, and left there unchanged for 85 days. Sixteen groups of R. prolixus bugs, consisting on average of 30 specimens each, were successively exposed to the bait and observed at different intervals for one week each. The mortality rate was 100% for all groups except the 16th, whose mortality rate was 96.7%. As the groups succeeded one another, mortality occurred more rapidly and was more marked at the 6- and 24-h intervals. Later tests, started at 6:00 a.m. and 6:00 p.m. respectively, showed that diurnal and nocturnal periodicity in the offer of food had no effect on mortality. First- and 2nd-instar nymphs and adult males were more sensitive, and 5th-instar nymphs were more resistant, to the active principle of the bait.

Relevance:

20.00%

Publisher:

Abstract:

Work presented within the scope of the Master's programme in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Lutzomyia verrucarum (Townsend, 1913) (Diptera: Psychodidae), the natural vector of Peruvian wart (verruga peruana), or Carrión's disease, is a species found only in Peru. Its geographic distribution lies between parallels 5º and 13º25' South latitude, in the Western and Inter-Andean valleys of the Andes. The altitudinal distribution of Lu. verrucarum varies across valleys: in the Western valleys it ranges from 1100 to 2980 m above sea level, and in the Inter-Andean valleys from 1200 to 3200 m. In certain verruga-endemic areas there is no correlation between the presence of Lu. verrucarum and Carrión's disease, which suggests the existence of secondary vectors.

Relevance:

20.00%

Publisher:

Abstract:

Arguably, the most difficult task in text classification is to choose an appropriate set of features that allows machine learning algorithms to provide accurate classification. Most state-of-the-art techniques for this task involve careful feature engineering and a pre-processing stage, which may be too expensive in the emerging context of massive collections of electronic texts. In this paper, we propose efficient methods for text classification based on information-theoretic dissimilarity measures, which are used to define dissimilarity-based representations. These methods dispense with any feature design or engineering by mapping texts into a feature space using universal dissimilarity measures; in this space, classical classifiers (e.g., nearest neighbor or support vector machines) can then be used. The reported experimental evaluation of the proposed methods, on sentiment polarity analysis and authorship attribution problems, reveals that they approximate, and sometimes even outperform, previous state-of-the-art techniques, despite being much simpler, in the sense that they do not require any text pre-processing or feature engineering.
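To make the idea concrete, the following is a minimal sketch of a dissimilarity-based, feature-free classifier of the kind described above. It uses the normalized compression distance (NCD), a standard universal information-theoretic dissimilarity, as a stand-in for the measures used in the paper (which are not specified here); the function names and toy data are illustrative only.

```python
import gzip

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a universal, feature-free
    dissimilarity between two byte strings, based on gzip."""
    cx = len(gzip.compress(x))
    cy = len(gzip.compress(y))
    cxy = len(gzip.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def classify(text: str, labeled: list[tuple[str, str]]) -> str:
    """1-nearest-neighbour in the dissimilarity space:
    return the label of the closest training text."""
    b = text.encode("utf-8")
    return min(labeled, key=lambda t: ncd(b, t[0].encode("utf-8")))[1]

train = [("the plot was dull and the acting worse", "neg"),
         ("a wonderful, moving film with great acting", "pos")]
print(classify("dull plot, bad acting", train))  # expected: "neg"
```

Note that no tokenization, stemming, or feature design is involved: the compressor implicitly captures the regularities shared by the two texts.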

Relevance:

20.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate.

Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among the sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a logarithmic law [39] to assure convergence (in probability) to the desired solution.

Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.

ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smallest convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]; we note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
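To illustrate the pure-pixel, simplex-based extraction idea behind VCA (iteratively project the data onto a direction orthogonal to the subspace spanned by the endmembers found so far, and take the extreme of the projection as the next endmember), here is a toy sketch under simplifying assumptions. It omits the affine projection, dimensionality reduction, and noise handling of the real algorithm; all names and data are illustrative.

```python
import numpy as np

def vca_like(R: np.ndarray, p: int, seed: int = 0) -> np.ndarray:
    """Toy pure-pixel extraction in the spirit of VCA.

    R : (L, N) matrix of N spectral vectors with L bands.
    p : number of endmembers to extract.
    Returns the indices of the selected (purest) pixels.
    Assumes at least one pure pixel per endmember, as VCA does.
    """
    rng = np.random.default_rng(seed)
    L, N = R.shape
    E = np.zeros((L, 0))  # endmember signatures found so far
    idx = []
    for _ in range(p):
        # random direction, with its component in span(E) removed,
        # so it is orthogonal to the endmembers already determined
        w = rng.standard_normal(L)
        if E.shape[1] > 0:
            w = w - E @ np.linalg.lstsq(E, w, rcond=None)[0]
        # the extreme of the projection is the new endmember
        k = int(np.argmax(np.abs(R.T @ w)))
        idx.append(k)
        E = np.column_stack([E, R[:, k]])
    return np.array(idx)

# synthetic test: 3 endmembers, 50 bands, 1000 mixed pixels
M = np.abs(np.random.default_rng(1).standard_normal((50, 3)))
A = np.random.default_rng(2).dirichlet(np.ones(3), size=1000).T
print(vca_like(M @ A, p=3))
```

Because the abundances sum to one, the data cloud is a simplex, and each projection extreme lands on (or near) a vertex, i.e., a pure pixel.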

Relevance:

20.00%

Publisher:

Abstract:

Project Work

Relevance:

20.00%

Publisher:

Abstract:

Energy systems worldwide are complex and challenging environments. Multi-agent based simulation platforms are increasing at a high rate, as they have shown to be a good option for studying many issues related to these systems, as well as the players that act in this domain. In this scope the authors' research group has developed a multi-agent system, MASCEM (Multi-Agent System for Competitive Electricity Markets), which simulates the electricity markets environment. MASCEM is integrated with ALBidS (Adaptive Learning Strategic Bidding System), which works as a decision support system for market players. The ALBidS system allows MASCEM market negotiating players to take the best possible advantage of the market context. This paper presents the application of a Support Vector Machines (SVM) based approach to provide decision support to electricity market players. This strategy is tested and validated by being included in ALBidS and then compared with the application of an Artificial Neural Network, yielding promising results. The proposed approach is tested and validated using real electricity market data from MIBEL, the Iberian market operator.
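As a hedged illustration of the kind of comparison described (an SVM-based predictor evaluated against an ANN on market data), here is a minimal sketch using scikit-learn. The features, the synthetic price model, and all parameter choices are hypothetical, not those used in ALBidS or MIBEL data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# hypothetical market features: hour of day, demand level, previous price
rng = np.random.default_rng(42)
X = rng.random((500, 3))
price = 30 + 20 * X[:, 1] + 5 * np.sin(2 * np.pi * X[:, 0]) \
        + rng.normal(0, 1, 500)

Xtr, Xte, ytr, yte = train_test_split(X, price, random_state=0)

# SVM-based predictor vs. an ANN baseline, as in the paper's comparison
svm = SVR(kernel="rbf", C=10.0).fit(Xtr, ytr)
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(Xtr, ytr)
print("SVM R^2:", svm.score(Xte, yte))
print("ANN R^2:", ann.score(Xte, yte))
```

In a decision-support setting such as ALBidS, the predicted price would then feed the bidding strategy of a negotiating market player.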

Relevance:

20.00%

Publisher:

Abstract:

This paper presents several forecasting methodologies based on the application of Artificial Neural Networks (ANN) and Support Vector Machines (SVM), directed to the prediction of the solar radiance intensity. The methodologies differ from each other by using different information in the training of the methods, i.e., different complementary environmental fields such as wind speed, temperature, and humidity. Additionally, different ways of considering the data series information have been considered. Sensitivity testing has been performed on all methodologies in order to achieve the best parameterizations for the proposed approaches. Results show that the SVM approach using the exponential Radial Basis Function (eRBF) kernel achieves the best forecasting results, in half the execution time of the ANN-based approaches.
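The exponential RBF is not a stock scikit-learn kernel, but SVR accepts a user-supplied kernel function, so a parameterization test of the kind described can be sketched as follows. The eRBF definition used here (a Gaussian-style kernel on the unsquared Euclidean distance) and all data and parameters are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.svm import SVR

def erbf_kernel(X, Y, sigma=1.0):
    """Exponential RBF: exp(-||x - y|| / (2 sigma^2)).
    Uses the Euclidean distance itself, not its square,
    which is what distinguishes it from the Gaussian RBF."""
    return np.exp(-cdist(X, Y) / (2 * sigma**2))

# toy radiance target from wind speed, temperature, humidity
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = 2 * X[:, 1] - X[:, 2] + 0.1 * rng.standard_normal(200)

model = SVR(kernel=lambda A, B: erbf_kernel(A, B, sigma=0.5), C=10.0)
model.fit(X[:150], y[:150])
print(model.score(X[150:], y[150:]))  # held-out R^2
```

Sensitivity testing then amounts to sweeping sigma and C (e.g., in a grid search) and keeping the parameterization with the best held-out score.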

Relevance:

20.00%

Publisher:

Abstract:

Wind speed forecasting has become an important field of research to support the electricity industry, mainly due to the increasing use of distributed energy sources, largely based on renewable sources. This type of electricity generation is highly dependent on weather condition variability, particularly the variability of the wind speed. Therefore, accurate wind power forecasting models are required for the operation and planning of wind plants and power systems. A Support Vector Machines (SVM) model for short-term wind speed forecasting is proposed and its performance is evaluated and compared with several artificial neural network (ANN) based approaches. A case study based on a real database covering 3 years of data, for predicting wind speed at 5-minute intervals, is presented.
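A common way to set up such short-term forecasting, sketched below under stated assumptions, is to turn the 5-minute series into lagged feature vectors and fit an SVM regressor to predict one step ahead. The synthetic series, lag count, and parameters are illustrative, not those of the paper.

```python
import numpy as np
from sklearn.svm import SVR

def make_lags(series: np.ndarray, n_lags: int):
    """Turn a 5-minute wind-speed series into (X, y) pairs:
    each row of X holds the n_lags previous samples,
    y is the value one step (5 minutes) ahead."""
    X = np.column_stack([series[i:len(series) - n_lags + i]
                         for i in range(n_lags)])
    return X, series[n_lags:]

# synthetic stand-in for the 3-year, 5-minute wind record
t = np.arange(5000)
wind = 8 + 2 * np.sin(2 * np.pi * t / 288) \
       + np.random.default_rng(0).normal(0, 0.5, t.size)

X, y = make_lags(wind, n_lags=12)        # last hour of samples
split = 4000
svm = SVR(kernel="rbf", C=10.0).fit(X[:split], y[:split])
print(svm.score(X[split:], y[split:]))   # one-step-ahead R^2
```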

Relevance:

20.00%

Publisher:

Abstract:

Enterprise and Work Innovation Studies, 6, IET, pp. 9-51

Relevance:

20.00%

Publisher:

Abstract:

Low back problems are associated with decreased quality of life. Specific exercises can improve quality of life, resulting in better professional performance and functionality. The purpose of this study was to evaluate the effect of following a 21-month exercise program on the quality of life of warehouse workers. The population included 557 male warehouse workers from a food distribution company in Oporto, Portugal. Upon application of the selection criteria, 249 workers were deemed eligible, who were randomized into two groups (125 in the intervention group and 124 in the control group). Subjects were then asked to volunteer for the study, with the final sample comprising 229 workers (112 in the intervention group and 117 in the control group). All subjects completed the SF-36 questionnaire prior to beginning the program and in the 11th and 21st months following it. The exercises were executed in the company facilities once a day for 8 min. Data were analyzed using SPSS® 17.0 for Windows®. After 11 months of following the exercise program, there was an increase in all scores for the experimental group, with statistically significant differences in the dimensions physical functioning (p = 0.019), bodily pain (p = 0.010), general health (p = 0.004), and role-physical (p = 0.037). The results obtained at the end of the study (21 months) showed significant improvements in the dimensions physical functioning (p = 0.002), role-physical (p = 0.007), bodily pain (p = 0.001), social functioning (p = 0.015), role-emotional (p = 0.011), and mental health (p = 0.001). In the control group all dimensions showed a decrease in mean scores. It can be concluded that the implementation of a low-back-specific exercise program positively changed the quality of life of warehouse workers.

Relevance:

20.00%

Publisher:

Abstract:

Meeting the world's increasingly demanding energy needs, together with the urgent need to find ways of using energy that pollute as little as possible, makes it necessary to explore paths that can fulfil these premises. Choosing renewable energy for power production becomes increasingly attractive, from both an environmental and an economic point of view. Fuzzy logic is founded on the handling of vague information, essentially the language spoken by human beings, making it possible to convert this kind of language into numerical form and thereby allow computational manipulation. Climatic elements such as the sun and the wind can be described in the form of linguistic variables, as in strong wind, low temperature, weak irradiation, etc. This makes control based on these phenomena well suited to fuzzy inference systems. For the proposed work, studies of renewable energy were carried out, with particular focus on solar and wind power. A study of the concepts of fuzzy logic and fuzzy inference systems was also performed, with the aim of understanding the various parameters involved in this subject. A data acquisition system was studied and developed, as well as the fuzzy controller that is the crux of the work described in this report. To this end, the work was carried out using the MATLAB software, from which applications were developed to obtain climatic data for use in the Fuzzy Logic toolbox, which was used to develop the entire control algorithm. With data acquisition in place and the required variables defined, the fuzzy controller was implemented and tuned throughout the work so as to guarantee the best possible results. Using the Guide tool, also from MATLAB, the system's user interface was created, making it possible to check the energy to be produced, as well as the contribution of each renewable energy source to that energy. Finally, an analysis of results was carried out by comparing the expected real values with the values obtained by the fuzzy controller, and conclusions and possibilities for future development of this work were presented.
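As an illustration of the Mamdani-style fuzzy inference such a controller relies on (fuzzification into linguistic terms like "weak wind" and "strong wind", rule evaluation, and centroid defuzzification), here is a minimal self-contained sketch in Python rather than the MATLAB Fuzzy Logic toolbox used in the work; the membership functions and rules are invented for illustration.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0)

# output universe: normalised power setpoint in [0, 1]
power = np.linspace(0, 1, 101)

# fuzzify a crisp wind reading of 7 m/s into two linguistic terms
reading = 7.0
mu_weak = tri(reading, 0, 2, 10)      # membership in "weak wind"
mu_strong = tri(reading, 4, 12, 20)   # membership in "strong wind"

# Mamdani rules with min implication and max aggregation:
#   IF wind is weak   THEN power is low
#   IF wind is strong THEN power is high
low = tri(power, 0.0, 0.1, 0.6)
high = tri(power, 0.4, 0.9, 1.0)
aggregated = np.maximum(np.minimum(mu_weak, low),
                        np.minimum(mu_strong, high))

# centroid defuzzification yields the crisp control output
crisp = np.trapz(aggregated * power, power) / np.trapz(aggregated, power)
print(f"power setpoint: {crisp:.2f}")
```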

Relevance:

20.00%

Publisher:

Abstract:

Malaria, a complex parasitic disease resulting from the interaction between parasite, human host, and vector, is one of the main health problems worldwide. Like other parasitic and infectious diseases, malaria plays an important role in evolution, and the role of human genetic variation in resistance to infection has already been demonstrated. After almost half a century of control, malaria persists on Santiago Island where, despite the low endemicity, individuals generally present moderate manifestations, infections below the level detectable by microscopy are diagnosed, and the vector lives very close to the supposedly susceptible population, while the frequency of the main human genetic polymorphisms most related to the disease and the population structure of the mosquito vector remain unknown. The general objectives of this thesis are 1) the study of the two classic genetic factors of the human host related to malaria, namely those linked to sickle cell anaemia and G6PD deficiency, together with the analysis of a probable involvement of PK, and 2) the genetic analysis of the mosquito vector populations, seeking to contribute to the understanding of the epidemiology of the disease on the Island and to the choice of appropriate control measures. The work focused on the detection of the allele responsible for haemoglobin S and of polymorphisms in the G6PD and PK genes in unrelated individuals (infected and uninfected), with analysis of their probable association with infection, and also on the genotyping of microsatellite loci of Anopheles arabiensis using PCR-based techniques. Regarding sickle cell anaemia, the frequency of trait carriers (HbAS individuals) and of the HbS allele was 6% and 5%, respectively, and for the G6PD variants, 0.8% for G6PDA- and 0.0% for G6PDMed; no association was found between the genotypes of these two factors and the presence of infection. Concerning the PKLR gene, no clear association was found between the polymorphisms analysed and infection status, but a marked linkage disequilibrium between the loci was detected, only in the uninfected, which may mean that this apparently conserved region of the gene has been selected for providing protection against infection and/or disease. The genetic diversity of the A. arabiensis populations at eleven microsatellite loci was moderate, with mean He values ranging from 0.481 to 0.522 and Rs from 4 to 5. Genetic differentiation based on 7 polymorphic loci was low (FST = 0.012; p < 0.001) but significant, varying between 0.001 and 0.023 among pairs of populations. The resistance alleles associated with the kdr gene were not detected. The low frequency of the G6PD-associated alleles (A- and Med) has important implications for the control strategies defined by the National Malaria Control Programme (PNLP), since primaquine can continue to be administered as a complement to therapeutic regimens when necessary. The A. arabiensis population in Santiago proved to be relatively homogeneous, with reduced structure, which may, on the one hand, represent a disadvantage by allowing the likely dispersal of resistance genes. On the other hand, this relative homogeneity may represent an advantage for the introduction of a control programme based on the release of transgenic mosquitoes.

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, hospital information systems must allow differentiated use by the various stakeholders, in a scenario of constant adaptation and evolution. To this end, interoperability between the hospital's information systems and the various service providers, as well as hospital devices, is essential. Although the need to support heterogeneity between systems is fundamental, access to and exchange of information must be done in a standardised, secure, and transparent way. The modern medical information infrastructure consists of many heterogeneous systems, with diverse mechanisms for controlling the underlying data. Information about a single patient may be scattered across several systems (e.g., patient transfers, readmissions, multiple treatments, etc.). The need to access patient data in a consolidated way from different locations becomes evident. It is therefore essential to use an architecture that promotes interoperability between systems. To achieve this interoperability, "middleware" layers can be implemented to adapt the information exchanges between systems. However, this does not solve the underlying problem, namely the need to use a standard to guarantee reliable client/provider interaction. To this end, a solution is proposed based on an ESB dedicated to the healthcare domain, called the HSB (Healthcare Service Bus). Among the most common standards in this area, HL7 and DICOM stand out, the latter aimed more specifically at hospital imaging devices, while the former is used for the management and exchange of medical information between systems. The case study on which this dissertation is based is that of a medium-sized hospital whose information system started out as a monolithic, single-vendor solution. Over the years, the single vendor split into several independent, competing ones, leading to an extremely worrying scenario in terms of maintenance and future evolution of the existing information system. As a result of the work carried out, an architecture was proposed that allows the progressive evolution of the current system towards a pure HSB.
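To make the HSB idea more tangible, here is a minimal sketch of the kind of HL7 v2 parsing and content-based routing such a bus might perform. The message, routing table, and consumer names are hypothetical, and a real HSB would rely on a full HL7 engine rather than this toy parser.

```python
# A sketch of message normalisation/routing in an HSB layer.
# HL7 v2 messages are pipe-delimited; segments are separated by \r.
RAW = ("MSH|^~\\&|LAB|HOSP|HIS|HOSP|202401151200||ORU^R01|123|P|2.4\r"
       "PID|1||555^^^HOSP||Doe^John")

def parse_hl7(raw: str) -> dict:
    """Split an HL7 v2 message into {segment id: list of fields}."""
    segments = {}
    for seg in raw.split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields
    return segments

# hypothetical routing table: message type -> consuming system
ROUTES = {"ORU^R01": "results-consumer",     # lab results
          "ADT^A01": "admissions-consumer"}  # patient admission

msg = parse_hl7(RAW)
# MSH-9 (message type) lands at index 8 after splitting, because the
# field separator itself counts as MSH-1 in HL7 numbering
msg_type = msg["MSH"][8]
print("route", msg_type, "to", ROUTES.get(msg_type, "dead-letter"))
```

In the proposed architecture, this routing/translation logic lives in the bus itself, so that each vendor system only needs to speak the standard, not the peculiarities of every other system.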