119 results for Detectability


Relevance: 10.00%

Abstract:

Locating the subsurface region that most influences measurements taken at the Earth's surface is a problem of great relevance in every area of geophysics. This work studies the location of that region, here called the principal zone, for frequency-domain electromagnetic methods, using as source a current line on the surface of a conductive half-space. In the model studied, the interior of this half-space contains a heterogeneity in the form of an infinite layer, or of a prism with square cross section and infinite length along the direction of the current line. The difference between the measurement obtained over the half-space containing the heterogeneity and that obtained over the homogeneous half-space depends, among other parameters, on the location of the heterogeneity relative to the transmitter-receiver system. Therefore, with the other parameters held constant, there is a position of the heterogeneity at which its influence on the measurements is maximal. Since this position depends on the conductivity contrast, on the dimensions of the heterogeneity and on the frequency of the transmitter current, what is characterized is a region, not a single position, in which the heterogeneity produces the maximum influence on the measurements. This region was named the principal zone. Once the principal zone is identified, it becomes possible to locate precisely the subsurface bodies that cause the observed anomalies. These are generally conductive bodies of interest for some specific purpose. Locating such bodies during prospecting not only facilitates exploration but also reduces production costs. To locate the principal zone, a detectability function (∆) was defined, capable of measuring the influence of the heterogeneity on the measurements. The function ∆ was computed for the amplitude and phase of the components of the magnetic field measured at the receiver that are tangential (Hx) and normal (Hz) to the Earth's surface.
By studying the extrema of ∆ under variations of the conductivity, size and depth of the heterogeneity, in one-dimensional and two-dimensional models, the dimensions of the principal zone were obtained, both laterally and in depth. The electromagnetic fields in one-dimensional models were obtained in a hybrid way, by numerically solving the integrals derived from the analytical formulation. For two-dimensional models, the solution was obtained with the finite element technique. The maxima of ∆ computed for the amplitude of Hx proved the most suitable for locating the principal zone: the location obtained from this quantity was more stable under variation of the physical properties and geometric dimensions than that obtained from the other quantities, in both the one- and two-dimensional models. When the conductive heterogeneity is an infinite horizontal layer (1D case), the depth of the central plane of that layer is given by p0 = 0.17 δ0, where p0 is that depth and δ0 is the plane-wave skin depth (in a homogeneous medium with the conductivity of the host medium (σ1), at the frequency ω at which the maximum of ∆ computed for the amplitude of Hx occurs). For a two-dimensional heterogeneity (2D case), the coordinates of the central axis of the principal zone are given by d0 = 0.77 r0 (d0 being the horizontal distance from the axis to the transmitting source) and p0 = 0.36 δ0 (p0 being the depth of the central axis of the principal zone), where r0 is the transmitter-receiver distance and δ0 the plane-wave skin depth, under the same conditions stipulated for the 1D case. Knowing the values of r0 and δ0 at which the maximum of ∆ computed for the amplitude of Hx occurs, one can determine (d0, p0).
To locate the principal zone (or, equivalently, an anomalous conductive zone in the subsurface), a method is suggested that consists of associating each value of the function ∆ of the amplitude of Hx with a point (d, p), generated through the relations d = 0.77 r and p = 0.36 δ, for each ω across the whole frequency spectrum of the measurements, over a given set of transmitter-receiver configurations. Contour curves of the isovalues of ∆ are then drawn; as the value of ∆ approaches its maximum, these curves converge on the location and approximate geometric dimensions of the heterogeneity (the principal zone).
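The 1D and 2D location relations above lend themselves to a direct numerical sketch. The sketch below assumes the usual plane-wave skin depth δ0 = √(2/(μ0 σ ω)) in SI units; the function names are illustrative, not from the original work.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum magnetic permeability (H/m)

def skin_depth(sigma, omega):
    """Plane-wave skin depth (m) in a homogeneous half-space of
    conductivity sigma (S/m) at angular frequency omega (rad/s)."""
    return math.sqrt(2.0 / (MU0 * sigma * omega))

def principal_zone_1d(sigma, omega):
    """1D case: depth of the central plane of the anomalous layer,
    p0 = 0.17 * delta0."""
    return 0.17 * skin_depth(sigma, omega)

def principal_zone_2d(sigma, omega, r0):
    """2D case: axis of the principal zone, d0 = 0.77 * r0 (horizontal
    distance from the source) and p0 = 0.36 * delta0 (depth)."""
    delta0 = skin_depth(sigma, omega)
    return 0.77 * r0, 0.36 * delta0

# Illustrative numbers: host conductivity 0.01 S/m, frequency 1 kHz,
# transmitter-receiver offset 500 m
omega = 2 * math.pi * 1e3
d0, p0 = principal_zone_2d(0.01, omega, 500.0)
```

Here ω would be the frequency at which the maximum of ∆ (amplitude of Hx) is observed, and σ the conductivity of the host medium.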

Relevance: 10.00%

Abstract:

Sparse arrays have a pitch larger than half a wavelength (λ/2) and a reduced number of elements compared with a fully populated array. Consequently, cost, data acquisition time and processing load are all reduced. However, conventional beamforming techniques produce large side and grating lobes, and hence image artifacts. In this work, the instantaneous phase of the signals is used in the beamforming technique, instead of the instantaneous amplitudes, to improve images obtained from sparse array configurations. A threshold based on a statistical analysis and on the number of signals used for imaging is applied to each pixel, in order to determine whether that pixel corresponds to a defect or not. Three data sets, covering medical and non-destructive testing applications, are used to evaluate the technique: a simulated point spread function, a medical phantom and an aluminum plate, with 2λ, 7λ and λ pitch, respectively. The conventional amplitude image is superposed with the image improved by the instantaneous phase, increasing reflector detectability and reducing artifacts in all cases, as well as the dead zone for the tested plate.
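A minimal sketch of the phase-based idea: after applying the focusing delays of a delay-and-sum beamformer, each pixel sums unit phasors built from the instantaneous phase (via the Hilbert transform) instead of the amplitudes. The unit-phasor coherence measure and all names are our illustrative reading, not the authors' exact formulation; the focusing delays are assumed precomputed.

```python
import numpy as np
from scipy.signal import hilbert

def phase_coherence_image(rf, delays, fs):
    """Phase-only beamforming sketch.

    rf:     (n_elements, n_samples) received RF signals
    delays: (n_pixels, n_elements) focusing delays in seconds
    fs:     sampling frequency in Hz
    Returns one coherence value per pixel: near 1 for a true reflector,
    lower where side/grating lobes produce incoherent phases."""
    analytic = hilbert(rf, axis=1)          # analytic signal per channel
    phase = np.angle(analytic)              # instantaneous phase
    n_el, n_s = rf.shape
    idx = np.clip((delays * fs).astype(int), 0, n_s - 1)
    sampled = phase[np.arange(n_el), idx]   # (n_pixels, n_elements)
    # coherent sum of unit phasors, normalized by the element count
    return np.abs(np.exp(1j * sampled).sum(axis=1)) / n_el
```

A per-pixel threshold on this coherence value (as in the abstract) would then decide whether the pixel is kept as a defect indication.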

Relevance: 10.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 10.00%

Abstract:

Infectious diseases can bring about population declines and local host extinctions, contributing significantly to the global biodiversity crisis. Nonetheless, studies measuring population-level effects of pathogens in wild host populations are rare, and taxonomically biased toward avian hosts and macroparasitic infections. We investigated the effects of bovine tuberculosis (bTB), caused by the bacterial pathogen Mycobacterium bovis, on African buffalo (Syncerus caffer) at Hluhluwe-iMfolozi Park, South Africa. We tested 1180 buffalo for bTB infection between May 2000 and November 2001. Most infections were mild, confirming the chronic nature of the disease in buffalo. However, our data indicate that bTB affects both adult survival and fecundity. Using an age-structured population model, we demonstrate that the pathogen can reduce population growth rate drastically; yet its effects appear difficult to detect at the population level: bTB causes no conspicuous mass mortalities or fast population declines, nor does it alter host-population age structure significantly. Our models suggest that this syndrome—low detectability coupled with severe impacts on population growth rate and, therefore, resilience—may be characteristic of chronic diseases in large mammals.
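The age-structured argument can be illustrated with a Leslie-matrix sketch: the population growth rate is the dominant eigenvalue of the projection matrix, and lowering adult survival and fecundity lowers it. The parameter values below are invented for illustration and are not the buffalo demography estimated in the study.

```python
import numpy as np

def growth_rate(fecundity, survival):
    """Dominant eigenvalue (lambda) of a Leslie matrix built from
    per-age-class fecundities (length n) and survival probabilities
    (length n-1, the sub-diagonal transition terms)."""
    n = len(fecundity)
    L = np.zeros((n, n))
    L[0, :] = fecundity                              # births into class 0
    L[np.arange(1, n), np.arange(n - 1)] = survival  # aging transitions
    return float(np.max(np.abs(np.linalg.eigvals(L))))

# Illustrative three-class model: bTB reduces adult survival and fecundity
healthy = growth_rate([0.0, 0.3, 0.4], [0.9, 0.85])
infected = growth_rate([0.0, 0.25, 0.3], [0.9, 0.7])
```

A growth rate below that of the healthy population, without mass mortality or a changed age structure, is exactly the "low detectability, severe impact" syndrome described above.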

Relevance: 10.00%

Abstract:

Determination of organic acids in intracellular extracts and in the cultivation media of marine microalgae aids investigations of the metabolic routes related to the assimilation of atmospheric carbon by these organisms, which are known for their role in the carbon dioxide sink. The separation of these acids was investigated by hydrophilic interaction liquid chromatography (HILIC) using isocratic elution with a mobile phase composed of 70:30 v/v acetonitrile/20 mmol/L ammonium acetate buffer (pH 6.8) and detection at 220 nm. HILIC allowed glycolic acid, the most important metabolite for the evaluation of the photorespiration process in algae, to be determined with better selectivity than reversed-phase liquid chromatography, but with lower detectability. The concentration of glycolic acid was determined in the cultivation media and in intracellular extracts of the algae Tetraselmis gracilis and Phaeodactylum tricornutum submitted to different aeration conditions: (i) without forced aeration, (ii) aeration with atmospheric air, and (iii) bubbling with N2. The concentration of glycolic acid increased most when the cultures were aerated with nitrogen, indicating a higher photorespiratory flux than in the cultures aerated with atmospheric air.

Relevance: 10.00%

Abstract:

Within-site variability in species detectability is a problem common to many biodiversity assessments and can strongly bias the results. Such variability can be caused by many factors, including simple counting inaccuracies, which can be solved by increasing sample size, or by temporal changes in species behavior, meaning that the way the temporal sampling protocol is designed is also very important. Here we use the example of mist-netted tropical birds to determine how design decisions in the temporal sampling protocol can alter the data collected and how these changes might affect the detection of ecological patterns, such as the species-area relationship (SAR). Using data from almost 3400 birds captured over 21,000 net-hours at 31 sites in the Brazilian Atlantic Forest, we found that the magnitude of ecological trends remained fairly stable, but the probability of detecting statistically significant ecological patterns varied depending on sampling effort, time of day and season in which sampling was conducted. For example, more species were detected in the wet season, but the SAR was strongest in the dry season. We found that the temporal distribution of sampling effort was more important than its total amount, finding that similar ecological results could have been obtained with one-third of the total effort, as long as each site was sampled equally over 2 yr. We argue that projects with the same sampling effort and spatial design, but different temporal sampling protocols, are likely to report different ecological patterns, which may ultimately lead to inappropriate conservation strategies.
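The species-area relationship mentioned above is conventionally fitted as a power law, S = c·A^z, estimated by linear regression in log-log space. A minimal sketch on synthetic data (not the Atlantic Forest data):

```python
import numpy as np

def fit_sar(areas, richness):
    """Fit the power-law species-area relationship S = c * A**z by
    linear regression of log(S) on log(A); returns (c, z)."""
    z, log_c = np.polyfit(np.log(areas), np.log(richness), 1)
    return np.exp(log_c), z

# synthetic sites following an exact power law with c = 5, z = 0.25
areas = np.array([1.0, 10.0, 100.0, 1000.0])
richness = 5.0 * areas ** 0.25
c, z = fit_sar(areas, richness)
```

In the study's setting, the point is that the richness values feeding this fit depend on when the nets were open, so the strength of the fitted relationship varies with the temporal protocol.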

Relevance: 10.00%

Abstract:

Background: The evolutionary advantages of selective attention are unclear. Since the study of selective attention began, it has been suggested that the nervous system only processes the most relevant stimuli because of its limited capacity [1]. An alternative proposal is that action planning requires the inhibition of irrelevant stimuli, which forces the nervous system to limit its processing [2]. An evolutionary approach might provide additional clues to clarify the role of selective attention.

Methods: We developed Artificial Life simulations in which animals were repeatedly presented with two objects, "left" and "right", each of which could be "food" or "non-food." The animals' neural networks (multilayer perceptrons) had two input nodes, one for each object, and two output nodes determining whether the animal ate each of the objects. The neural networks also had a variable number of hidden nodes, which determined whether or not they had enough capacity to process both stimuli (Table 1). The evolutionary relevance of the left and right food objects could also vary, depending on how much the animal's fitness increased when ingesting them (Table 1). We compared sensory processing in animals with or without limited capacity, evolved in simulations in which the objects had the same or different relevances.

Table 1. Nine sets of simulations were performed, varying the values of the food objects and the number of hidden nodes in the neural networks. The values of the left and right food objects were swapped during the second half of the simulations. Non-food objects were always worth -3.

The evolution of the neural networks was simulated by a simple genetic algorithm. Fitness was a function of the number of food and non-food objects each animal ate, and the chromosomes encoded the node biases and synaptic weights. During each simulation, 10 populations of 20 individuals each evolved in parallel for 20,000 generations; then the relevance of the food objects was swapped and the simulation was run for another 20,000 generations. The neural networks were evaluated by their ability to identify the two objects correctly. The detectability (d') for the left and right objects was calculated using Signal Detection Theory [3].

Results and conclusion: When both stimuli were equally relevant, networks with two hidden nodes processed only one stimulus and ignored the other. With four or eight hidden nodes, they could correctly identify both stimuli. When the stimuli had different relevances, the d' for the most relevant stimulus was higher than the d' for the least relevant stimulus, even when the networks had four or eight hidden nodes. We conclude that selection mechanisms arose in our simulations depending not only on the size of the neural networks but also on the stimuli's relevance for action.
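The d' computation from Signal Detection Theory [3] can be sketched directly; this assumes the standard equal-variance Gaussian model, where d' is the difference of the z-transformed hit and false-alarm rates.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Detectability index from Signal Detection Theory:
    d' = z(hit rate) - z(false-alarm rate), with z the inverse of the
    standard normal CDF. Rates must lie strictly between 0 and 1."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)
```

In the simulations above, "hits" would be food objects correctly eaten and "false alarms" non-food objects eaten, tallied separately for the left and right objects.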

Relevance: 10.00%

Abstract:

The subject of this Ph.D. research thesis is the development and application of multiplexed analytical methods based on bioluminescent whole-cell biosensors. One of the main goals of analytical chemistry is multianalyte testing, in which two or more analytes are measured simultaneously in a single assay. The advantages of multianalyte testing are work simplification, high throughput, and a reduction in the overall cost per test. The availability of multiplexed portable analytical systems is of particular interest for on-field analysis of clinical, environmental or food samples, as well as for the drug discovery process. To allow highly sensitive and selective analysis, these devices should combine biospecific molecular recognition with ultrasensitive detection systems. To address the current need for rapid, highly sensitive and inexpensive devices for obtaining more data from each sample, genetically engineered whole-cell biosensors as biospecific recognition elements were combined with ultrasensitive bioluminescence detection techniques. Genetically engineered cell-based sensing systems were obtained by introducing into bacterial, yeast or mammalian cells a vector expressing a reporter protein whose expression is controlled by regulatory proteins and promoter sequences. The regulatory protein is able to recognize the presence of the analyte (e.g., compounds with hormone-like activity, heavy metals…) and consequently to activate the expression of the reporter protein, which can be readily measured and directly related to the bioavailable concentration of the analyte in the sample. Bioluminescence represents the ideal detection principle for miniaturized analytical devices and multiplexed assays, thanks to its high detectability in small sample volumes, allowing accurate signal localization and quantification.
The first chapter of this dissertation discusses the development of improved bioluminescent proteins emitting at different wavelengths, in terms of increased thermostability, enhanced emission decay kinetics and spectral resolution. The second chapter focuses mainly on the use of these proteins in the development of whole-cell based assays with improved analytical performance. In particular, since the main drawback of whole-cell biosensors is the high variability of their analyte-specific response, caused mainly by variations in cell viability due to nonspecific effects of the sample matrix, an additional bioluminescent reporter was introduced to correct the analytical response, thus increasing the robustness of the bioassays. The feasibility of combining two or more bioluminescent proteins to obtain biosensors with internal signal correction, or for the simultaneous detection of multiple analytes, was demonstrated by developing a dual-reporter yeast-based biosensor for androgenic activity measurement and a triple-reporter mammalian cell-based biosensor for the simultaneous monitoring of the activation of two CYP450 enzymes involved in cholesterol degradation, using two spectrally resolved intracellular luciferases and a secreted luciferase as a control for cell viability. The third chapter presents the development of a portable multianalyte detection system. In order to develop a portable system that can be used outside the laboratory environment, even by non-skilled personnel, cells were immobilized in a new biocompatible and transparent polymeric matrix within a modified clear-bottom black 384-well microtiter plate to obtain a bioluminescent cell array. The cell array was placed in contact with a portable charge-coupled device (CCD) light sensor able to localize and quantify the luminescent signal produced by the different bioluminescent whole-cell biosensors.
This multiplexed biosensing platform containing whole-cell biosensors was successfully used to measure the overall toxicity of a given sample, as well as to obtain dose-response curves for heavy metals and to detect hormonal activity in clinical samples (PCT/IB2010/050625: "Portable device based on immobilized cells for the detection of analytes." Michelini E, Roda A, Dolci LS, Mezzanotte L, Cevenini L, 2010). At the end of the dissertation, some future development steps are discussed toward a point-of-care testing (POCT) device that combines portability, minimal sample pre-treatment and highly sensitive multiplexed assays within a short assay time. In this POCT perspective, field-flow fractionation (FFF) techniques, in particular the gravitational variant (GrFFF), which exploits the Earth's gravitational field to drive the separation, were investigated for cell fractionation, characterization and isolation. Thanks to the simplicity of its equipment, amenable to miniaturization, GrFFF appears particularly suited for implementation in POCT devices. It may be used as an integrated pre-analytical module applied directly to raw samples, driving target analytes to the modules where biospecific recognition reactions based on ultrasensitive bioluminescence detection occur, providing an increase in overall analytical throughput.
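The internal signal correction by a constitutive viability reporter amounts to a ratio normalization against a control well; a minimal sketch (function and argument names are illustrative, not from the thesis):

```python
def corrected_response(analyte_signal, viability_signal,
                       control_analyte, control_viability):
    """Normalize the analyte-specific reporter signal by the
    constitutive viability-reporter signal, relative to an untreated
    control well. A value > 1 indicates analyte-induced expression
    beyond what cell viability alone explains."""
    sample_ratio = analyte_signal / viability_signal
    control_ratio = control_analyte / control_viability
    return sample_ratio / control_ratio
```

For example, a matrix that halves cell viability halves both reporter signals, leaving the corrected response unchanged, which is exactly the robustness gain described above.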

Relevance: 10.00%

Abstract:

The qualitative and quantitative analysis of biomolecules has gained ever more importance in recent years and decades. With the advent and continuous development of new separation and detection methods, and their combination into powerful coupled systems, new insights were gained step by step. Element mass spectrometry, as a highly sensitive detection method, is used by many research groups for the separation and quantification of proteins and metalloproteins by detecting the metals and heteroatoms present in the biomolecules. Heteroatoms (e.g., sulfur, phosphorus) ionize poorly in the plasma of the ICP-MS (inductively coupled plasma mass spectrometer) and accordingly have much higher detection limits than metals. One approach to making compounds that are poorly detectable or undetectable with the ICP-MS (i.e., those containing no metals or heteroatoms) visible is to label them with metal ions or metal clusters.

In this work it was possible to give new impetus to the analysis of two quite different substance classes, metallic nanoparticles on the one hand and proteins on the other, and to demonstrate the future potential of coupled separation and detection techniques. By coupling an old but newly conceived separation technique, gel electrophoresis (GE), to a modern detector, the ICP-MS, the gel electrophoresis widely used in protein analysis can combine its enormous potential for separating a wide range of compound classes with the excellent sensitivity and element specificity of the ICP-MS, producing qualitative and also quantitative results with considerably less effort than before. Previously this was possible only with great preparative effort, using laser ablation.

In the analysis of nanoparticles, it could be shown that, thanks to the good separation properties of the GE, the GE-ICP-MS coupling separates the species or fractions present from one another, and that the ICP-MS provides information at the atomic level. It was possible to determine the atomic ratio of the metal atoms in the core to the sulfur atoms in the ligand shell of a nanoparticle, and thus to estimate the size of the particle. The number of gold atoms in a nanoparticle similar to the Schmid cluster could also be determined, which previously had been possible only with MALDI-TOF. In the analysis of biomolecules, the degree of phosphorylation of various proteins could be determined in a simple way. Gel electrophoresis also achieves excellent separation results for small molecules, for example in the analysis of various bromine and iodine species.

The stoichiometric coupling of a protein to a nanoparticle without substantially altering either compound, however, proved to be a challenge that could not be fully solved within this work. Various approaches to coupling the two substances were tried, but none led to the desired result of a stoichiometrically complete and specific modification of a protein with a nanoparticle. Nevertheless, the potential of the GE-ICP-MS coupling demonstrated for the analysis of both substance classes, and the proven practicability and reliability of the method, lay the foundation for further research in this field. Once a suitable chemical coupling of the two substance classes has been found and mastered, a powerful combination of separation and detection will be available on the analytical side to decisively improve the quantification of proteins.
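The size estimate from the core-metal to ligand-sulfur ratio can be sketched as follows. The one-sulfur-per-thiolate-ligand assumption, the bulk gold atomic volume and the spherical-core approximation are our illustrative assumptions, not values taken from the thesis.

```python
import math

# approximate volume per Au atom in bulk fcc gold (nm^3)
GOLD_ATOMIC_VOLUME = 0.017

def gold_atoms_from_ratio(au_s_ratio, n_thiol_ligands):
    """Core atom count from the ICP-MS Au:S molar ratio, assuming
    exactly one sulfur atom per thiolate ligand in the shell."""
    return au_s_ratio * n_thiol_ligands

def core_diameter(n_gold_atoms):
    """Approximate diameter (nm) of a spherical gold core, assuming
    bulk atomic density: V = n * v_atom, d = 2 * (3V / 4pi)^(1/3)."""
    volume = n_gold_atoms * GOLD_ATOMIC_VOLUME
    return 2.0 * (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)
```

With a measured Au:S ratio and a known ligand count, this recovers the core atom number (e.g., Au55 for a Schmid-type cluster) and a rough core diameter.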

Relevance: 10.00%

Abstract:

This thesis work was carried out at the Medical Physics service of the Sant'Orsola-Malpighi University Hospital in Bologna. The study focused on the comparison between standard reconstruction techniques (Filtered Back Projection, FBP) and iterative techniques in computed tomography. The work was divided into two parts. In the first, the quality of images acquired with a multislice CT scanner (iCT 128, Philips system) was analyzed using both the FBP algorithm and the iterative one (in our case iDose4). To evaluate image quality, the following parameters were analyzed: the noise power spectrum (NPS), the modulation transfer function (MTF) and the contrast-to-noise ratio (CNR). The first two quantities were studied through measurements on a phantom supplied by the manufacturer, which simulated the body and head sections with two cylinders of 32 and 20 cm, respectively. The measurements confirm the noise reduction, but to a different extent for the different convolution filters used. The MTF study, on the other hand, revealed that using standard or iterative techniques does not change the spatial resolution: the curves obtained are essentially identical (apart from the intrinsic differences between convolution filters), contrary to what the manufacturer claims. For the CNR analysis two phantoms were used. The first, the Catphan 600, is the phantom used to characterize CT systems. The second, the Cirs 061, contains inserts that simulate the presence of lesions with densities typical of the abdominal region. The study showed that, for both phantoms, the contrast-to-noise ratio increases when the iterative reconstruction technique is used.
The second part of the thesis work was to evaluate the dose reduction, considering several protocols used in clinical practice; a large number of examinations were analyzed and the mean CTDI and DLP values were calculated on a sample of examinations reconstructed with FBP and with iDose4. The results show that the values obtained with the iterative algorithm are below the national diagnostic reference levels and below those of systems that do not use iterative methods.
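The CNR compared above is conventionally the difference between the mean pixel values of a lesion insert and the background, divided by the background noise; a minimal sketch, assuming the regions of interest have already been extracted from the phantom images:

```python
import numpy as np

def cnr(lesion_roi, background_roi):
    """Contrast-to-noise ratio between a lesion ROI and a background
    ROI: |mean_lesion - mean_background| / std_background."""
    contrast = abs(lesion_roi.mean() - background_roi.mean())
    return contrast / background_roi.std()
```

With iterative reconstruction the background standard deviation drops while the contrast is preserved, so this ratio increases, which is the effect reported for both phantoms.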

Relevance: 10.00%

Abstract:

The purpose of the present study was to evaluate the detectability and dimensions of periapical lesions, the relationship of the mandibular canal to the roots of the respective teeth, and the dimension of the buccal bone by using limited cone-beam computed tomography (CBCT) in comparison to conventional periapical (PA) radiographs for evaluation of mandibular molars before apical surgery.

Relevance: 10.00%

Abstract:

With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
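The occurrence, severity and detectability scores on the 1-to-10 scales combine into the usual risk priority number, RPN = O × S × D, by which process parameters and failure modes are ranked. A minimal sketch with invented failure-mode names (the rating values are illustrative, not taken from the guideline):

```python
def risk_priority_number(occurrence, severity, detectability):
    """RPN = O x S x D on the 1-to-10 FMEA scales (10 = worst)."""
    for score in (occurrence, severity, detectability):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must lie between 1 and 10")
    return occurrence * severity * detectability

# hypothetical biopharmaceutical failure modes: (O, S, D)
failure_modes = {
    "bioreactor pH excursion": (4, 8, 3),
    "filter integrity breach": (2, 9, 2),
    "buffer mix-up": (3, 7, 6),
}

# rank failure modes by descending RPN for risk-based prioritization
ranking = sorted(failure_modes,
                 key=lambda k: risk_priority_number(*failure_modes[k]),
                 reverse=True)
```

Sorting by RPN is what yields the importance ranking of process parameters mentioned above, from which variables for process development, characterization or validation are selected.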

Relevance: 10.00%

Abstract:

Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasting habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which calls into question the reliability of count-based indices for estimating and comparing population abundance. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends of butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and drawbacks of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and resource availability.
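Incorporating detectability is what separates capture-mark-recapture from raw counts: the recapture fraction measures how detectable individuals are. The classical two-session estimator (Chapman's bias-corrected Lincoln-Petersen form, shown here as a generic illustration, not necessarily the models used in the study) can be sketched as:

```python
def lincoln_petersen(marked_first, caught_second, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen estimate of absolute
    population size from a two-session capture-mark-recapture survey:
    N = (M+1)(C+1)/(R+1) - 1, where M individuals are marked in
    session 1, C are caught in session 2, and R of those carry marks."""
    return ((marked_first + 1) * (caught_second + 1)) / (recaptured + 1) - 1
```

The same count on a transect can correspond to very different absolute abundances depending on the recapture (detection) rate, which is the core argument of the abstract.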

Relevance: 10.00%

Abstract:

Neospora caninum is one of the most frequent abortifacient organisms worldwide. The parasite is transmitted diaplacentally from the pregnant cow to the fetus, which normally leads to the delivery of a healthy but persistently infected calf; abortion is thus a relatively rare event. In more than 90% of cases, bovine neosporosis is transmitted vertically, through endogenous reactivation in a persistently infected dam; exogenous infections are therefore responsible for less than 10% of cases. The question arises as to which infection sources may be relevant in this context. In Switzerland, the role of dogs as definitive hosts has been shown to be of low significance in this respect. Recently, discussion has focused on the potential of infectious bull semen following natural or artificial insemination: a few years ago, a report documented the detectability of N. caninum DNA, by nested PCR, in the semen of naturally infected bulls. We therefore decided to gain our own experience by investigating five separate semen specimens per animal from 20 N. caninum-seropositive bulls used for artificial insemination in Switzerland. All samples turned out to be negative by nested PCR. Based on our laboratory experience, the potential bull-semen-associated Neospora problem does not seem to affect the Swiss bull population, and there is thus no evidence to support the inclusion of further control measures in this respect.

Relevance: 10.00%

Abstract:

Triggered event-related functional magnetic resonance imaging requires sparse intervals of temporally resolved functional data acquisition, whose initiation corresponds to the occurrence of an event, typically an epileptic spike in the electroencephalographic trace. However, conventional fMRI time series are strongly affected by non-steady-state magnetization effects, which obscure the initial blood oxygen level-dependent (BOLD) signals. Here, conventional echo-planar imaging and a post-processing solution based on principal component analysis were employed to remove the dominant eigenimages of the time series, filtering out the global signal changes induced by magnetization decay and recovering BOLD signals starting with the first functional volume. This approach was compared with a physical solution using radiofrequency preparation, which nullifies magnetization effects. As an application of the method, the detectability of the initial transient BOLD response in the auditory cortex, elicited by the onset of acoustic scanner noise, was used to demonstrate that post-processing-based removal of magnetization effects detects brain activity patterns identical to those obtained with radiofrequency preparation. Using the auditory responses as an ideal experimental model of triggered brain activity, our results suggest that reducing initial magnetization effects by removing a few principal components from the fMRI data may be useful in the analysis of triggered event-related echo-planar time series. The implications of this study are discussed with particular attention to the remaining technical limitations and to additional neurophysiological issues of triggered acquisition.
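The post-processing step described, removing the dominant principal components (eigenimages) from the time series, can be sketched with a plain SVD. The number of components to drop and the time × voxel data layout are assumptions of this sketch, not the paper's exact pipeline:

```python
import numpy as np

def remove_principal_components(timeseries, n_components):
    """Remove the n dominant principal components from a demeaned fMRI
    time series arranged as (time points x voxels), filtering out the
    global signal changes driven by magnetization decay."""
    X = timeseries - timeseries.mean(axis=0)   # demean each voxel
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[:n_components] = 0.0                     # zero the dominant eigenimages
    return U @ np.diag(s) @ Vt                 # reconstruct the filtered series
```

Because the magnetization decay is a global, high-variance signal, it is captured by the first few components; what remains is the residual series in which the initial BOLD transients become visible.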