960 results for Graph partitioning


Relevance: 10.00%

Abstract:

OBJECTIVE: To assess the diagnostic accuracy of two indirect immunofluorescence protocols for canine visceral leishmaniasis. METHODS: Dogs from a seroepidemiological survey carried out in 2003 in an endemic area (the municipalities of Araçatuba and Andradina, in the northwest region of the state of São Paulo) and from a non-endemic area (the metropolitan region of São Paulo) were used to compare two protocols of the indirect immunofluorescence assay (RIFI) for leishmaniasis: one using the heterologous antigen Leishmania major (RIFI-BM) and the other using the homologous antigen Leishmania chagasi (RIFI-CH). Accuracy was estimated with two-graph receiver operating characteristic (TG-ROC) analysis, which compared readings at the 1:20 dilution of the homologous antigen (RIFI-CH), taken as the reference test, with the dilutions of RIFI-BM (heterologous antigen). RESULTS: The 1:20 dilution of the RIFI-CH test showed the best contingency coefficient (0.755) and the strongest association between the two variables studied (chi-square = 124.3), and was therefore taken as the test's reference dilution in comparisons with the different dilutions of the RIFI-BM test. The best RIFI-BM results were obtained at the 1:40 dilution, with the best contingency coefficient (0.680) and the strongest association (chi-square = 80.8). With the cut-off change suggested by this analysis for the 1:40 dilution of RIFI-BM, specificity increased from 57.5% to 97.7%, although the 1:80 dilution gave the best sensitivity estimate (80.2%) with the new cut-off. CONCLUSIONS: TG-ROC analysis can provide important information about diagnostic tests, suggest cut-off points that improve a test's sensitivity and specificity estimates, and evaluate them in light of the best cost-benefit trade-off.
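A minimal sketch of the TG-ROC idea described above, assuming synthetic titer data (all numbers hypothetical): sensitivity and specificity are computed as functions of the candidate cut-off, and the point where the two curves cross suggests a cut-off that balances them.

```python
import numpy as np

# Two-graph ROC (TG-ROC) sketch: sensitivity and specificity as
# functions of the cut-off. The titer data below are synthetic,
# purely for illustration.
rng = np.random.default_rng(0)
negatives = rng.lognormal(mean=2.0, sigma=0.6, size=200)  # non-infected dogs
positives = rng.lognormal(mean=3.5, sigma=0.6, size=100)  # infected dogs

cutoffs = np.linspace(min(negatives.min(), positives.min()),
                      max(negatives.max(), positives.max()), 500)
sensitivity = np.array([(positives >= c).mean() for c in cutoffs])
specificity = np.array([(negatives < c).mean() for c in cutoffs])

# The suggested cut-off is where |Se - Sp| is smallest (curves cross).
best = cutoffs[np.argmin(np.abs(sensitivity - specificity))]
print(f"suggested cut-off: {best:.2f}")
```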

Relevance: 10.00%

Abstract:

Understanding the structural network formed by neurons in the brain at the macro scale is a current challenge in neuroscience. This study analyzed the structural connectivity of the brain in 22 healthy individuals and in two patients with post-traumatic epilepsy, and assessed the differences between the two groups. Differences related to gender and age within the group of healthy individuals were also investigated, along with which individuals have the highest mean values in the network characterization metrics. To this end, an analysis protocol was developed using several specialized software packages, and Graph Theory metrics were used to characterize the structural connectivity between 118 distinct brain regions. Within the group of healthy individuals, it was concluded that men, in general, have the highest mean values for the structural network characterization metrics. However, no significant gender differences were observed in the global brain characterization metrics. Age, in general, correlates negatively with the structural network characterization metrics. The regions where the most important differences between healthy individuals and patients were observed are the Rolandic sulcus, the hippocampus, the precuneus, the thalamus, and the cerebellum, bilaterally. These differences are consistent with the patients' radiological images and with the literature on post-traumatic epilepsy. Further developments in the study of human brain structural connectivity are anticipated, since its potential can be combined with other methods to characterize alterations in brain circuits.
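As a rough illustration of the kind of Graph Theory metrics used here, the sketch below builds a toy 118-region connectivity matrix (random numbers standing in for tractography-derived weights, which are an assumption) and computes a few common network measures with networkx.

```python
import numpy as np
import networkx as nx

# Hypothetical 118x118 structural connectivity matrix (one row/column
# per brain region); in practice this would come from tractography.
n_regions = 118
rng = np.random.default_rng(1)
w = rng.random((n_regions, n_regions))
w = (w + w.T) / 2            # symmetric, undirected connectivity
np.fill_diagonal(w, 0)
w[w < 0.9] = 0               # keep only the strongest connections

G = nx.from_numpy_array(w)

# Common graph-theory metrics used to characterize structural networks.
print("density:", nx.density(G))
print("mean clustering:", nx.average_clustering(G, weight="weight"))
print("degree of region 0:", G.degree[0])
```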

Relevance: 10.00%

Abstract:

Microarrays allow thousands of genes to be monitored simultaneously, quantifying the abundance of transcripts under the same experimental condition at the same time. Among the various array technologies available, two-channel cDNA microarray experiments have arisen in numerous technical protocols associated with genomic studies, and they are the focus of this work. Microarray experiments involve many steps, each of which can affect the quality of the raw data. Background correction and normalization are preprocessing techniques for cleaning and correcting the raw data when undesirable fluctuations arise from technical factors. Several recent studies showed that no preprocessing strategy outperforms the others in all circumstances, so it seems difficult to provide general recommendations. In this work, we propose using exploratory techniques to visualize the effects of preprocessing methods on the statistical analysis of two-channel cancer microarray data sets in which the cancer types (classes) are known. The arrow plot was used to select differentially expressed genes, and the graph of profiles resulting from correspondence analysis was used to visualize the results. Six background correction methods and six normalization methods were combined, yielding 36 preprocessing pipelines, which were evaluated on a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, whose microarrays were already classified by cancer type. All statistical analyses were performed using the R statistical software.
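A minimal sketch of the two preprocessing steps on synthetic two-channel data; the study itself was done in R and compared many method variants, while this Python toy shows only simple background subtraction and global median normalization of the log-ratios.

```python
import numpy as np

# Two-channel preprocessing sketch on synthetic data: background
# subtraction, then within-array median normalization of the log2
# ratios M = log2(R/G). Real pipelines offer many variants of both.
rng = np.random.default_rng(2)
red_fg, red_bg = rng.gamma(5, 200, 1000), rng.gamma(2, 50, 1000)
green_fg, green_bg = rng.gamma(5, 180, 1000), rng.gamma(2, 50, 1000)

# Background correction: subtract, flooring at a small positive value.
red = np.maximum(red_fg - red_bg, 0.5)
green = np.maximum(green_fg - green_bg, 0.5)

m = np.log2(red / green)          # log-ratios
a = 0.5 * np.log2(red * green)    # average log-intensities
m_normalized = m - np.median(m)   # global median normalization

print("median M before:", np.median(m), "after:", np.median(m_normalized))
```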

Relevance: 10.00%

Abstract:

Project work submitted to obtain the Master's degree in Informatics and Computer Engineering

Relevance: 10.00%

Abstract:

Seven pyrethroids (bifenthrin, fenpropathrin, λ-cyhalothrin, permethrin, α-cypermethrin, fenvalerate, and deltamethrin) were extracted from water using C18 solid-phase extraction disks, followed by analysis with gas chromatography with an electron capture detector (GC-ECD). The limits of detection in water samples, obtained from the calibration graph, ranged from 0.5 ng L-1 (fenpropathrin) to 110 ng L-1 (permethrin). The effects of reusing the same disks (up to four times, with several concentrations) on the recoveries of the pyrethroids were assessed. The recoveries all remained between 70 and 120% after four uses of the same disk, with no difference between these recoveries at the 95% confidence level.
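As a hedged illustration of deriving a detection limit from a calibration graph, the sketch below fits a straight line to hypothetical concentration/response pairs and applies the common LOD = 3.3·s/m rule; the paper does not state which LOD criterion was used, so this is an assumption.

```python
import numpy as np

# Limit-of-detection estimate from a calibration graph, using
# LOD = 3.3 * s / m (s: std. error of the regression residuals,
# m: slope). Concentrations and detector responses are made up.
conc = np.array([1, 5, 10, 25, 50, 100.0])        # ng L-1
area = np.array([12, 55, 108, 265, 540, 1070.0])  # detector response

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
s = residuals.std(ddof=2)                         # residual std. error

lod = 3.3 * s / slope
print(f"slope={slope:.2f}, LOD ~ {lod:.2f} ng L-1")
```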

Relevance: 10.00%

Abstract:

The main goal of virtual character animation is to tell a story through emotionally expressive virtual characters. Characters have distinct personalities and convey their emotions and thought processes through their behaviors (non-verbal communication). Their actions often involve generating complex body movements. There are several issues to consider when animating a complex entity, such as the positions of the movable parts and their velocities. Virtual characters are one example of complex entities and are among the elements most used in computer animation. The focus of this dissertation was a proposed animation system for virtual characters whose movements and facial expressions can convey emotions and moods. The primary movements, that is, the movements that define the characters' behavior, come from human motion capture. Secondary animations, such as facial expressions, are created in Autodesk Maya using the BlendShapes technique. The motion capture data are organized into a behavior library through a motion graph known as a Move Tree. This structure allows real-time control of the characters by managing their state. The system also enables effective transitions between similar movements and between different locomotion speeds, minimizing the foot-sliding artifact known as footskate. It thus becomes possible to define a path that the character can follow with smooth movements. The results of the evaluation sessions, which assessed the quality of the transitions between animations, are also reported. Finally, improving the system by implementing automatic construction of the motion graph is proposed.
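A minimal sketch of the motion-graph ("Move Tree") idea, with hypothetical clip names and transition frames: each node is a motion-capture clip, and edges record the frames at which a blend into another clip is allowed.

```python
from dataclasses import dataclass, field

# Toy motion graph: nodes are mocap clips; edges mark frames where a
# smooth transition to another clip may start. All values hypothetical.
@dataclass
class Clip:
    name: str
    n_frames: int
    transitions: dict = field(default_factory=dict)  # target -> start frame

idle = Clip("idle", 60, {"walk": 55})          # near the end of the loop
walk = Clip("walk", 40, {"idle": 35, "run": 20})
run = Clip("run", 30, {"walk": 25})

graph = {c.name: c for c in (idle, walk, run)}

def next_clip(current: str, requested: str) -> str:
    """Follow the requested transition only if the graph allows it."""
    return requested if requested in graph[current].transitions else current

print(next_clip("idle", "walk"))  # -> walk
print(next_clip("idle", "run"))   # -> idle (no direct edge; stay put)
```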

Relevance: 10.00%

Abstract:

In life cycle impact assessment (LCIA) models, the sorption of the ionic fraction of dissociating organic chemicals is not adequately modeled because conventional non-polar partitioning models are applied. High uncertainties are therefore expected when modeling the mobility of dissociating organic chemicals, as well as their bioavailability for uptake by exposed biota and their degradation. Alternative regressions that account for the ionized fraction of a molecule when estimating fate parameters were applied to the USEtox model. The model parameters most sensitive in the estimation of ecotoxicological characterization factors (CFs) of micropollutants were evaluated by Monte Carlo analysis in both the default USEtox model and the alternative approach. Negligible differences in CF values and 95% confidence limits between the two approaches were estimated for direct emissions to the freshwater compartment; however, for emissions to the agricultural soil compartment, the default USEtox model overestimates the CFs and the 95% confidence limits of basic compounds by up to three and four orders of magnitude, respectively, relative to the alternative approach. For three emission scenarios, LCIA results show that the default USEtox model overestimates freshwater ecotoxicity impacts for the emission scenarios to agricultural soil by one order of magnitude, with larger confidence limits, relative to the alternative approach.
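A toy Monte Carlo propagation in the spirit of the analysis described above; the one-line "model" below is a stand-in, not the USEtox equations, and the parameter distributions are hypothetical.

```python
import numpy as np

# Monte Carlo uncertainty propagation sketch: sample uncertain fate and
# effect parameters, push them through a toy characterization-factor
# model (CF ~ residence_time / EC50), and summarize the CF distribution.
rng = np.random.default_rng(3)
n = 10_000

# Lognormal parameter distributions (hypothetical geometric means/SDs).
residence_time = rng.lognormal(np.log(5.0), 0.8, n)  # days
ec50 = rng.lognormal(np.log(2.0), 1.0, n)            # mg/L

cf = residence_time / ec50
lo, hi = np.percentile(cf, [2.5, 97.5])
print(f"CF median={np.median(cf):.2f}, 95% CI=({lo:.2f}, {hi:.2f})")
```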

Relevance: 10.00%

Abstract:

Polyaromatic hydrocarbon (PAH) sorption to soil is a key process governing the transport and fate of PAHs, and their potential toxic impacts in soil and groundwater ecosystems, for example in connection with atmospheric PAH deposition on soils. There are numerous studies on PAH sorption in relatively low-organic porous media such as urban soils and groundwater sediments, but less attention has been given to cultivated soils. In this study, the phenanthrene partition coefficient, KD (liter per kilogram), was measured on 143 cultivated Danish soils (115 topsoils, 0–0.25-m soil depth, and 28 subsoils, 0.25–1-m depth) by the single-point adsorption method. The organic carbon partition coefficient, KOC (liter per kilogram), for topsoils was found generally to fall between the KOC values estimated by the two most frequently used models for PAH partitioning, the Abdul et al. (Hazardous Waste & Hazardous Materials 4(3):211–222, 1987) model and the Karickhoff et al. (Water Research 13:241–248, 1979) model. A less-recognized model by Karickhoff (Chemosphere 10:833–846, 1981), yielding a KOC of 14,918 L kg−1, closely corresponded to the average measured KOC value for the topsoils, and this model is therefore recommended for prediction of phenanthrene mobility in cultivated topsoils. For lower subsoils (0.25–1-m depth), the KOC values were closer to, and mostly below, the estimate by the Abdul et al. (1987) model. This implies a different organic matter composition and higher PAH sorption strength in cultivated topsoils, likely due to management effects including more rapid carbon turnover. Finally, we applied the recent Dexter et al. (Geoderma 144:620–627, 2008) theorem and calculated the complexed and non-complexed organic carbon fractions (COC and NCOC, grams per gram). Multiple regression analyses showed that the NCOC-based phenanthrene partition coefficient (KNCOC) could be markedly higher than the COC-based partition coefficient (KCOC) for soils with a clay/OC ratio <10. This possibly higher PAH sorption affinity of the NCOC fraction needs further investigation to develop more realistic and accurate models for PAH mobility and effects in the environment, also with regard to colloid-facilitated PAH transport.
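A small worked illustration of how such KOC values are used: the soil-water partition coefficient follows from KD = KOC · fOC, where fOC is the soil's organic carbon fraction. The KOC is the Karickhoff (1981) value quoted above; the fOC values are hypothetical.

```python
# Phenanthrene sorption sketch: K_D = K_OC * f_OC, with K_OC from the
# Karickhoff (1981) model quoted in the text and made-up soil f_OC values.
K_OC = 14_918.0  # L/kg

for f_OC in (0.005, 0.015, 0.030):   # example topsoil OC fractions (g/g)
    K_D = K_OC * f_OC
    print(f"f_OC={f_OC:.3f} -> K_D={K_D:,.0f} L/kg")
```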

Relevance: 10.00%

Abstract:

This study modeled the impact on freshwater ecosystems of pharmaceuticals detected in biosolids following application on agricultural soils. The detected sulfonamides and hydrochlorothiazide displayed comparatively moderate retention in solid matrices and, therefore, higher transfer fractions from biosolids to the freshwater compartment. However, the residence times of these pharmaceuticals in freshwater were estimated to be short due to abiotic degradation processes. The non-steroidal anti-inflammatory drug mefenamic acid had the highest environmental impact on aquatic ecosystems and warrants further investigation. The estimation of the solid-water partitioning coefficient was generally the most influential parameter of the probabilistic comparative impact assessment. These results and the modeling approach used in this study serve to prioritize pharmaceuticals in the research effort to assess the risks and environmental impacts of these emerging pollutants on aquatic biota.

Relevance: 10.00%

Abstract:

Master's degree in Socio-Organizational Intervention in Health - Specialization branch: Quality and Health Technologies

Relevance: 10.00%

Abstract:

In the past years, Software Architecture has attracted increased attention from academia and industry as the unifying concept for structuring the design of complex systems. One particular research area deals with the possibility of reconfiguring architectures to adapt the systems they describe to new requirements. Reconfiguration amounts to adding and removing components and connections, and may have to occur without stopping the execution of the system being reconfigured. This work contributes to the formal description of such a process. Taking as a premise that a single formalism hardly ever satisfies all requirements in every situation, we present three approaches, each with its own assumptions about the systems it can be applied to and with different advantages and disadvantages. Each approach builds on the work of other researchers and has the aesthetic concern of changing the original formalism as little as possible, keeping its spirit. The first approach shows how a given reconfiguration can be specified in the same manner as the system it is applied to, and in a way that can be executed efficiently. The second approach explores the Chemical Abstract Machine, a formalism for rewriting multisets of terms, to describe architectures, computations, and reconfigurations in a uniform way. The last approach uses a UNITY-like parallel programming design language to describe computations, represents architectures by diagrams in the sense of Category Theory, and specifies reconfigurations by graph transformation rules.
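A minimal sketch of the last approach's flavor, reconfiguration as graph rewriting, with the architecture as a graph of components and connections; the component names and the rule are hypothetical, and networkx stands in for a proper graph-transformation engine.

```python
import networkx as nx

# Architecture as a graph: nodes are components, edges are connections.
arch = nx.DiGraph()
arch.add_edges_from([("client", "server_v1"), ("server_v1", "db")])

def replace_component(g: nx.DiGraph, old: str, new: str) -> None:
    """Rewrite rule: substitute `new` for `old`, preserving connections."""
    g.add_node(new)
    for pred in list(g.predecessors(old)):
        g.add_edge(pred, new)
    for succ in list(g.successors(old)):
        g.add_edge(new, succ)
    g.remove_node(old)   # removal completes the reconfiguration

replace_component(arch, "server_v1", "server_v2")
print(sorted(arch.edges))  # [('client', 'server_v2'), ('server_v2', 'db')]
```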

Relevance: 10.00%

Abstract:

The Darwinian Particle Swarm Optimization (DPSO) is an evolutionary algorithm that extends Particle Swarm Optimization with natural selection to enhance the ability to escape from sub-optimal solutions. An extension of the DPSO to multi-robot applications has recently been proposed and denoted Robotic Darwinian PSO (RDPSO); it benefits from dynamically partitioning the whole population of robots, thereby decreasing the amount of information exchange required among robots. This paper further extends the previously proposed algorithm by adapting the behavior of robots based on a set of context-based evaluation metrics. Those metrics are then used as inputs to a fuzzy system that systematically adjusts the RDPSO parameters (i.e., the outputs of the fuzzy system), thus improving its convergence rate and its robustness to obstacles and communication constraints. The adapted RDPSO is evaluated in groups of physical robots and further explored using larger populations of simulated mobile robots within a larger scenario.
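A toy PSO with one adaptively scheduled parameter (the inertia weight) on a 2-D sphere function, to give a feel for the parameter adaptation described above; the real RDPSO adapts several parameters through a fuzzy system and adds natural selection, none of which is reproduced here.

```python
import numpy as np

# Basic PSO on f(x) = sum(x^2) with a linearly decreasing inertia weight.
rng = np.random.default_rng(4)
n, dims, iters = 20, 2, 100
x = rng.uniform(-5, 5, (n, dims))
v = np.zeros((n, dims))
pbest, pbest_val = x.copy(), (x**2).sum(axis=1)
gbest = pbest[pbest_val.argmin()].copy()

for t in range(iters):
    w = 0.9 - 0.5 * t / iters            # adaptively scheduled inertia
    r1, r2 = rng.random((2, n, dims))
    v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (gbest - x)
    x = x + v
    vals = (x**2).sum(axis=1)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best value:", pbest_val.min())
```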

Relevance: 10.00%

Abstract:

Consider the problem of scheduling a set of sporadic tasks on a multiprocessor system to meet deadlines using a task-splitting scheduling algorithm. Task-splitting (also called semi-partitioning) scheduling algorithms assign most tasks to just one processor but a few tasks are assigned to two or more processors, and they are dispatched in a way that ensures that a task never executes on two or more processors simultaneously. A particular type of task-splitting algorithms, called slot-based task-splitting dispatching, is of particular interest because of its ability to schedule tasks with high processor utilizations. Unfortunately, no slot-based task-splitting algorithm has been implemented in a real operating system so far. In this paper we discuss and propose some modifications to the slot-based task-splitting algorithm driven by implementation concerns, and we report the first implementation of this family of algorithms in a real operating system running Linux kernel version 2.6.34. We have also conducted an extensive range of experiments on a 4-core multicore desktop PC running task-sets with utilizations of up to 88%. The results show that the behavior of our implementation is in line with the theoretical framework behind it.
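A rough sketch of the dispatching idea, under the assumption (common in the slot-based literature) that a split task's budget is placed at the end of one processor's time slot and at the beginning of the next processor's slot, so the two pieces can never run in parallel; all numbers are made up.

```python
# Slot-based task-splitting sketch: time is divided into equal slots,
# and a split task runs in a reserve at the end of CPU0's slot and at
# the start of CPU1's slot; the shared boundary prevents parallel runs.
SLOT = 10.0  # slot length (ms), hypothetical

def reserves(split_exec: float):
    """Split a task's per-slot budget across two adjacent reserves."""
    end_piece = split_exec / 2
    cpu0 = (SLOT - end_piece, SLOT)       # e.g. [7.5, 10.0) on CPU 0
    cpu1 = (0.0, split_exec - end_piece)  # e.g. [0.0, 2.5) on CPU 1
    return cpu0, cpu1

print(reserves(5.0))  # ((7.5, 10.0), (0.0, 2.5))
```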

Relevance: 10.00%

Abstract:

This paper studies static-priority preemptive scheduling on a multiprocessor using partitioned scheduling. We propose a new scheduling algorithm and prove that if the proposed algorithm is used and less than 50% of the capacity is requested, then all deadlines are met. It is known that for every static-priority multiprocessor scheduling algorithm there is a task set that misses a deadline even though the requested capacity is arbitrarily close to 50%.
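The 50% bound can be made concrete with a first-fit partitioning sketch that admits a task onto a processor only while that processor's utilization stays within the bound; the task set below is hypothetical, and this is not the specific algorithm proposed in the paper.

```python
# First-fit partitioned scheduling sketch with a 50% per-CPU
# utilization bound, echoing the bound discussed above.
BOUND = 0.5

def first_fit(utils, n_cpus):
    cpus = [0.0] * n_cpus
    assignment = []
    for u in utils:
        for i, load in enumerate(cpus):
            if load + u <= BOUND:
                cpus[i] += u
                assignment.append(i)
                break
        else:
            return None   # task set not schedulable under the bound
    return assignment

print(first_fit([0.2, 0.3, 0.25, 0.15], n_cpus=2))  # -> [0, 0, 1, 1]
```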

Relevance: 10.00%

Abstract:

Master's dissertation in Informatics Engineering