948 results for LOW-RESOLUTION STRUCTURES


Relevance:

80.00%

Publisher:

Abstract:

The St. Lawrence Island polynya (SLIP) is a commonly occurring winter phenomenon in the Bering Sea, in which dense saline water produced during new ice formation is thought to flow northward through the Bering Strait to help maintain the Arctic Ocean halocline. Winter darkness and inclement weather have made continuous in situ and remote observation of this polynya difficult. However, imagery acquired from the European Space Agency ERS-1 Synthetic Aperture Radar (SAR) has allowed observation of the St. Lawrence Island polynya using both the imagery and derived ice displacement products. With the development of ARCSyM, a high-resolution regional model of the Arctic atmosphere/sea ice system, simulation of the SLIP in a climate model is now possible. Intercomparisons between remotely sensed products and simulations can lead to additional insight into the SLIP formation process. Low-resolution SAR, SSM/I and AVHRR infrared imagery for the St. Lawrence Island region are compared with the results of a model simulation for the period of 24-27 February 1992. The imagery illustrates a polynya event (polynya opening). With northerly winds strong and consistent over several days, the coupled model captures the SLIP event with moderate accuracy. However, the introduction of a stability-dependent atmosphere-ice drag coefficient, which allows feedbacks between atmospheric stability, open water, and air-ice drag, produces a more accurate simulation of the SLIP in comparison to satellite imagery. Model experiments show that the polynya event is forced primarily by changes in atmospheric circulation followed by persistent favorable conditions; ocean surface currents are found to have a small but positive impact on the simulation, which is enhanced when wind forcing is weak or variable.
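The stability-dependent drag mechanism described above can be sketched in a few lines. The functional form and the constant `alpha` below are illustrative assumptions, not the ARCSyM parameterization: unstable air over open water (negative bulk Richardson number) enhances the neutral drag coefficient, while stable stratification damps it.

```python
def air_ice_drag(cd_neutral, bulk_richardson, alpha=5.0):
    """Illustrative stability-dependent air-ice drag coefficient.

    bulk_richardson < 0: unstable air over open water (e.g. a polynya),
    so drag is enhanced and wind-driven ice export strengthens.
    bulk_richardson > 0: stable stratification, so drag is damped.
    The functional form and alpha are assumptions for illustration only.
    """
    if bulk_richardson < 0:
        return cd_neutral * (1.0 - alpha * bulk_richardson) ** 0.5
    return cd_neutral / (1.0 + alpha * bulk_richardson)
```

The feedback loop in the text follows directly: opening water destabilizes the near-surface air, which raises the drag, which widens the polynya.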

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to evaluate the frequency and clinical associations of HLA-DR alleles in Brazilian Caucasian patients with polyarteritis nodosa (PAN) or microscopic polyangiitis (MPA). We evaluated 29 Caucasian patients with vasculitis classified as PAN or MPA according to the American College of Rheumatology (ACR) 1990 criteria, the Chapel Hill Consensus Conference (CHCC) nomenclature for vasculitis and the EULAR recommendations for conducting clinical studies in systemic vasculitis. HLA-DR alleles were typed using polymerase chain reaction-amplified DNA hybridized with sequence-specific low-resolution primers. DNA obtained from 59 healthy Caucasian blood donors was used as a control. To evaluate whether a specific HLA allele influences the clinical profile of these diseases, we also divided the patients according to the Birmingham Vasculitis Activity Score (BVAS) and Five-Factor Score (FFS) at the time of diagnosis. An increased frequency of HLA-DRB1*16 (p = 0.023) and DRB4*01 (p = 0.048) was found in patients with higher disease activity at the time of diagnosis (BVAS >= 22). Patients with less severe disease (FFS = 0) had a higher frequency of HLA-DRB1*03 (p = 0.011). Patients with gastrointestinal tract involvement had a significantly increased frequency of HLA-DRB1*11 or B1*12 (p = 0.046), B1*13 (p = 0.021) and B3 (p = 0.008). In contrast, patients with renal disease had a higher frequency of DRB1*15 or DRB1*16 (p = 0.035) and B5 (p = 0.035). In the subgroup of patients with MPA, an increased frequency of HLA-DRB1*15 was found in patients with BVAS >= 22 (p = 0.038) and FFS >= 1 (p = 0.039), suggesting that this allele is associated with more aggressive disease. Antineutrophil cytoplasmic antibody (ANCA)-negative MPA patients had a significantly increased frequency of HLA-DRB1*11 or DRB1*12 when compared to ANCA-positive patients (p = 0.023).
Our results suggest that HLA-DR alleles may influence PAN and MPA clinical expression and outcome, and that in MPA they participate in the mechanisms involved in the development of ANCA.
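The reported p-values for allele-frequency differences are of the kind produced by a 2x2 contingency-table test (allele carrier yes/no by patient/control). A self-contained sketch of a two-sided Fisher's exact test (the study does not state its statistical software; this is illustrative):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].
    Sums the probabilities of all tables with the same margins that are
    no more likely than the observed one."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def hyper(x):  # P(top-left cell = x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = hyper(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # small tolerance so tables exactly as likely as the observed one count
    return sum(hyper(x) for x in range(lo, hi + 1)
               if hyper(x) <= p_obs * (1 + 1e-9))
```

For example, `fisher_exact_2x2(carriers_patients, noncarriers_patients, carriers_controls, noncarriers_controls)` returns the p-value for one allele.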

Relevance:

80.00%

Publisher:

Abstract:

Human leukocyte antigen (HLA) haplotypes are frequently evaluated for population history inferences and association studies. However, the available typing techniques for the main HLA loci usually do not allow the determination of the allele phase and the constitution of a haplotype, which may otherwise be obtained by a very time-consuming and expensive family-based segregation study. Without a family-based study, computational inference by probabilistic models is necessary to obtain haplotypes. Several authors have used the expectation-maximization (EM) algorithm to determine HLA haplotypes, but high levels of erroneous inference are expected because of the genetic distance among the main HLA loci and the presence of several recombination hotspots. In order to evaluate the efficiency of computational inference methods, 763 unrelated individuals stratified into three different datasets had their haplotypes manually defined in a family-based study of HLA-A, -B, -DRB1 and -DQB1 segregation, and these haplotypes were compared with the data obtained by three methods: the EM and Excoffier-Laval-Balding (ELB) algorithms implemented in the Arlequin 3.11 software, and the PHASE method. All algorithms showed poor performance for haplotype reconstruction with distant loci, estimating incorrect haplotypes for 38%-57% of the samples across algorithms and datasets. We suggest that computational inferences of low-resolution HLA-A, HLA-B, HLA-DRB1 and HLA-DQB1 haplotypes should be considered with caution.
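The phase-ambiguity problem and the EM approach can be made concrete with a minimal two-locus sketch (illustrative only; Arlequin and PHASE implement far more elaborate multi-locus versions). A double heterozygote is compatible with two phasings; the E-step weights them by current haplotype frequencies, and the M-step recounts.

```python
from collections import defaultdict

def em_haplotypes(genotypes, n_iter=200):
    """Minimal two-locus EM haplotype-frequency estimator.

    Each genotype is ((a1, a2), (b1, b2)): unphased allele pairs at two
    loci. Not the Arlequin/PHASE implementation - a didactic sketch."""
    def phasings(g):
        (a1, a2), (b1, b2) = g
        return list({tuple(sorted([(a1, b1), (a2, b2)])),
                     tuple(sorted([(a1, b2), (a2, b1)]))})

    # initialise with equal weight on every compatible phasing
    freqs = defaultdict(float)
    for g in genotypes:
        for h1, h2 in phasings(g):
            freqs[h1] += 1.0
            freqs[h2] += 1.0
    total = sum(freqs.values())
    freqs = {h: c / total for h, c in freqs.items()}

    for _ in range(n_iter):
        counts = defaultdict(float)
        for g in genotypes:
            opts = phasings(g)
            w = [freqs.get(h1, 0.0) * freqs.get(h2, 0.0) for h1, h2 in opts]
            z = sum(w)
            for (h1, h2), wi in zip(opts, w):
                share = wi / z if z > 0 else 1.0 / len(opts)
                counts[h1] += share  # expected haplotype counts (E-step)
                counts[h2] += share
        total = sum(counts.values())
        freqs = {h: c / total for h, c in counts.items()}  # M-step
    return freqs
```

With distant, recombining loci like HLA-A and HLA-DQB1, the frequency-based weighting often picks the wrong phase, which is the failure mode quantified above.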

Relevance:

80.00%

Publisher:

Abstract:

To test a mathematical model for measuring blinking kinematics. Spontaneous and reflex blinks of 23 healthy subjects were recorded with two different temporal resolutions. A magnetic search coil was used to record 77 blinks sampled at 200 Hz and 2 kHz in 13 subjects. A video system with low temporal resolution (30 Hz) was employed to register 60 blinks of 10 other subjects. The experimental data points were fitted with a model that assumes that the upper eyelid movement can be divided into two parts: an impulsive accelerated motion followed by a damped harmonic oscillation. All spontaneous and reflex blinks, including those recorded with low resolution, were well fitted by the model with a median coefficient of determination of 0.990. No significant difference was observed when the parameters of the blinks were estimated with the under-damped or critically damped solutions of the harmonic oscillator. On the other hand, the over-damped solution was not applicable to fit any movement. There was good agreement between the model and numerical estimation of the amplitude but not of maximum velocity. Spontaneous and reflex blinks can be mathematically described as consisting of two different phases. The down-phase is mainly an accelerated movement followed by a short time that represents the initial part of the damped harmonic oscillation. The latter is entirely responsible for the up-phase of the movement. Depending on the instantaneous characteristics of each movement, the under-damped or critically damped oscillation is better suited to describe the second phase of the blink. (C) 2010 Elsevier B.V. All rights reserved.
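The two-phase description can be written down directly. Below is a sketch of the position model with the under-damped solution (parameter names and values are illustrative, not fitted values from the paper): a uniformly accelerated down-phase matched in position and velocity to a damped harmonic oscillation at the transition time `t1`.

```python
import math

def blink_position(t, t1, accel, gamma, omega, x_rest=0.0):
    """Two-phase blink model: uniformly accelerated motion for t < t1,
    then an under-damped harmonic oscillation about x_rest, matched in
    position and velocity at t = t1. Parameters are illustrative."""
    x1 = 0.5 * accel * t1 ** 2   # lid position at end of down-phase
    v1 = accel * t1              # lid velocity at end of down-phase
    if t < t1:
        return 0.5 * accel * t ** 2
    s = t - t1
    c1 = x1 - x_rest
    c2 = (v1 + gamma * c1) / omega   # enforces velocity continuity at t1
    return x_rest + math.exp(-gamma * s) * (c1 * math.cos(omega * s)
                                            + c2 * math.sin(omega * s))
```

Fitting such a function to sampled lid positions with least squares yields the coefficient of determination used in the study; the critically damped case replaces the cosine/sine pair with `(c1 + c2*s) * exp(-gamma*s)`.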

Relevance:

80.00%

Publisher:

Abstract:

Linkage disequilibrium (LD) mapping is commonly used as a fine-mapping tool in human genome mapping and has been used with some success for initial disease gene isolation in certain isolated inbred human populations. An understanding of the population history of domestic dog breeds suggests that LD mapping could be routinely utilized in this species for initial genome-wide scans. Such an approach offers significant advantages over traditional linkage analysis. Here, we demonstrate, using canine copper toxicosis in the Bedlington terrier as the model, that LD mapping could reasonably be expected to be a useful strategy in low-resolution, genome-wide scans in pure-bred dogs. Significant LD was demonstrated over distances up to 33.3 cM. It is very unlikely, for a number of reasons discussed, that this result could be extrapolated to the rest of the genome. It is, however, consistent with the expectation given the population structure of canine breeds and, in this breed at least, with the hypothesis that it may be possible to utilize LD in a genome-wide scan. In this study, LD mapping confirmed the location of the Bedlington terrier copper toxicosis gene (CT-BT) and was able to do so in a population that was refractory to traditional linkage analysis.
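The LD statistic underlying such a scan can be made concrete. A minimal sketch of the classical normalized disequilibrium coefficient D' for two biallelic loci (the study's analysis over cM-scale marker distances is more involved):

```python
def d_prime(p_ab, p_a, p_b):
    """Normalized linkage disequilibrium D' between two biallelic loci,
    given the frequency p_ab of haplotype A-B and the allele frequencies
    p_a and p_b. |D'| = 1 indicates complete LD."""
    d = p_ab - p_a * p_b          # raw disequilibrium coefficient D
    if d == 0.0:
        return 0.0
    if d > 0:
        d_max = min(p_a * (1.0 - p_b), (1.0 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1.0 - p_a) * (1.0 - p_b))
    return d / d_max
```

In a young, bottlenecked breed, |D'| stays near 1 over long map distances, which is what makes low-resolution genome-wide LD scans feasible.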

Relevance:

80.00%

Publisher:

Abstract:

Wind resource evaluation at two sites in Portugal was performed using the Weather Research and Forecasting (WRF) mesoscale modelling system and the wind resource analysis tool commonly used within the wind power industry, the Wind Atlas Analysis and Application Program (WAsP) microscale model. Wind measurement campaigns were conducted at the selected sites, allowing for a comparison between in situ measurements and simulated wind, in terms of flow characteristics and energy yield estimates. Three different methodologies were tested, aiming to provide an overview of the benefits and limitations of these methodologies for wind resource estimation. In the first methodology the mesoscale model acts like a “virtual” wind measuring station: wind data computed by WRF for both sites were inserted directly as input into WAsP. In the second approach, the same procedure was followed, but the terrain influences induced by the mesoscale model's low-resolution terrain data were removed from the simulated wind data. In the third methodology, the simulated wind data were extracted at the top of the planetary boundary layer for both sites, to assess whether the use of geostrophic winds (which, by definition, are not influenced by the local terrain) can bring any improvement in the models' performance. The results of these methodologies were compared with those from in situ measurements, in terms of mean wind speed, Weibull probability density function parameters and production estimates, considering the installation of one wind turbine at each site. Results showed that the second approach produces values closest to the measured ones, and fairly acceptable deviations were found using this coupling technique in terms of estimated annual production. However, mesoscale output should not be used directly in wind farm siting projects, mainly because of the poor resolution of the mesoscale model's terrain data.
Instead, the use of mesoscale output in microscale models should be seen as a valid alternative to in situ data, mainly for preliminary wind resource assessments, although the application of mesoscale-microscale coupling in areas with complex topography should be done with extreme caution.
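The Weibull-based production estimate mentioned above can be sketched as follows. The turbine power curve and parameter values are illustrative, and WAsP's internal procedure is more elaborate:

```python
import math

def weibull_pdf(v, k, c):
    """Weibull probability density for wind speed v (shape k, scale c)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def annual_energy_mwh(power_curve_kw, k, c, dv=0.1, v_max=40.0):
    """Annual energy yield (MWh) of one turbine: midpoint integration of
    an arbitrary power curve (kW vs. wind speed, m/s) against the Weibull
    distribution. Cut-in/cut-out behaviour is assumed to be encoded in
    power_curve_kw. Illustrative only."""
    hours_per_year = 8760.0
    v, mean_power_kw = dv / 2.0, 0.0
    while v < v_max:
        mean_power_kw += power_curve_kw(v) * weibull_pdf(v, k, c) * dv
        v += dv
    return mean_power_kw * hours_per_year / 1000.0
```

With a real power curve the routine gives the annual production estimate; with `power_curve_kw = lambda v: v` it returns (up to units) the Weibull mean wind speed, so it can be checked against the closed form c·Γ(1 + 1/k).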

Relevance:

80.00%

Publisher:

Abstract:

15th IEEE International Conference on Electronics, Circuits and Systems, Malta

Relevance:

80.00%

Publisher:

Abstract:

The methods used by modern medicine in the field of Molecular Imaging, with its ability to diagnose from "organ function" rather than "organ morphology", have lent the fundamental component of this medical imaging modality, Nuclear Medicine, added importance, reflected in a significant increase in its use across its different clinical applications. Beyond the purely clinical aspects, which alone would be enough to fill several dissertations such as this one, the very nature of this imaging technique, with its inherently low resolution and long acquisition times, has raised additional concerns regarding productivity (the number of studies performed per unit of time), quality (improving the resolution of the acquired image) and the levels of radioactivity injected into patients (the effective radiation dose to the population). Given the known technological limitations in the design of Nuclear Medicine acquisition equipment, which, despite the advances introduced, keeps the basic operating concepts of the gamma camera more or less unchanged, a significant change in the acquisition parameters (time, resolution, activity) was envisaged, acting not on the technical-mechanical conditions of acquisition but essentially on the post-processing of data acquired by the traditional methods that still constitute the state of the art of this modality. The aim of this work is therefore to explain, in some detail, the technological foundations that have always underpinned the systems used to perform Nuclear Medicine examinations and, above all, to present how they differ from the innovative methods which, by applying essentially knowledge (software), make it possible to address the issues raised above.

Relevance:

80.00%

Publisher:

Abstract:

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Electrical and Computer Engineering of the Faculdade de Ciências e Tecnologia of Universidade Nova de Lisboa

Relevance:

80.00%

Publisher:

Abstract:

Knowledge of the spatial distribution of hydraulic conductivity (K) within an aquifer is critical for reliable predictions of solute transport and the development of effective groundwater management and/or remediation strategies. While core analyses and hydraulic logging can provide highly detailed information, such information is inherently localized around boreholes that tend to be sparsely distributed throughout the aquifer volume. Conversely, larger-scale hydraulic experiments like pumping and tracer tests provide relatively low-resolution estimates of K in the investigated subsurface region. As a result, traditional hydrogeological measurement techniques leave a gap in terms of spatial resolution and coverage, and they alone are often inadequate for characterizing heterogeneous aquifers. Geophysical methods have the potential to bridge this gap. The recent increased interest in the application of geophysical methods to hydrogeological problems is clearly evidenced by the formation and rapid growth of the domain of hydrogeophysics over the past decade (e.g., Rubin and Hubbard, 2005).

Relevance:

80.00%

Publisher:

Abstract:

A 0.125 degree raster or grid-based Geographic Information System with data on tsetse, trypanosomosis, animal production, agriculture and land use has recently been developed in Togo. This paper addresses the problem of generating tsetse distribution and abundance maps from remotely sensed data, using a restricted amount of field data. A discriminant analysis model is tested using contemporary tsetse data and remotely sensed, low resolution data acquired from the National Oceanographic and Atmospheric Administration and Meteosat platforms. A split sample technique is adopted where a randomly selected part of the field measured data (training set) serves to predict the other part (predicted set). The obtained results are then compared with field measured data per corresponding grid-square. Depending on the size of the training set the percentage of concording predictions varies from 80 to 95 for distribution figures and from 63 to 74 for abundance. These results confirm the potential of satellite data application and multivariate analysis for the prediction, not only of the tsetse distribution, but more importantly of their abundance. This opens up new avenues because satellite predictions and field data may be combined to strengthen or substitute one another and thus reduce costs of field surveys.
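The split-sample validation scheme can be sketched in a few lines. A nearest-centroid classifier stands in for the paper's discriminant model, and all names and data below are illustrative:

```python
import random

def split_sample_concordance(samples, labels, train_frac=0.7, seed=1):
    """Split-sample evaluation of a minimal nearest-centroid discriminant:
    a random training set predicts the held-out grid squares, and the
    percentage of concordant predictions is returned. Illustrative only."""
    rng = random.Random(seed)
    idx = list(range(len(samples)))
    rng.shuffle(idx)
    cut = int(train_frac * len(idx))
    train, test = idx[:cut], idx[cut:]

    # class centroids estimated from the training set
    cents = {}
    for i in train:
        cents.setdefault(labels[i], []).append(samples[i])
    for lab, vecs in cents.items():
        cents[lab] = [sum(col) / len(vecs) for col in zip(*vecs)]

    def predict(x):  # assign the class with the nearest centroid
        return min(cents, key=lambda lab: sum((a - b) ** 2
                                              for a, b in zip(x, cents[lab])))

    hits = sum(predict(samples[i]) == labels[i] for i in test)
    return 100.0 * hits / len(test)
```

Repeating this over several random splits and training-set sizes gives concordance ranges analogous to the 80-95% (distribution) and 63-74% (abundance) figures reported above.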

Relevance:

80.00%

Publisher:

Abstract:

Research project carried out during a stay at the Institute of Mineralogy and Geochemistry of the University of Lausanne, Switzerland, between 2007 and 2009. Over the last decade, the scientific community has recognized that tropical regions play a key role in the dynamic processes that control global climate change, probably as a trigger for changes occurring at high latitudes. Moreover, tropical ocean sediments, lying beyond the direct impact of the continental ice sheets formed during glaciations, provide a continuous record of the planet's climatic variations. Nevertheless, many questions remain about the specific role of the tropics, especially regarding abrupt sub-orbital variations, owing to the few high-resolution records from these areas that span several glacial/interglacial cycles. To help clarify the role of the Southern Hemisphere tropics in climate control at the millennial scale, the distribution and isotopic composition of marine and terrestrial molecular biomarkers were studied, at low resolution, in core MD98-2165 (9º39'S, 118º20'E, 2100 m water depth, 42.3 m long), located southwest of Indonesia, where the highest sea-surface temperatures on the planet are recorded together with strong convective activity that influences the distribution of atmospheric moisture over a large area of the Earth. The observed distributions of terrigenous biomarkers (C23-C33 n-alkanes and C20-C32 n-alkan-1-ols) are typical of higher-plant lipids that reach the ocean mainly by aeolian transport. The C31 alkane and the C28 or C32 alcohols are the most abundant homologues in both cores. Notably, the C32 alcohol is the dominant homologue during glacial periods, suggesting an expansion of tropical C4 plants associated with more arid conditions.
The provenance of these lipids is corroborated by their carbon isotopic composition, which makes it possible to distinguish the photosynthetic pathway used and, therefore, the type of plant.
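The link between leaf-wax carbon isotopes and photosynthetic pathway is commonly quantified with a two-endmember mass balance. A minimal sketch, in which the C3 and C4 endmember values are illustrative assumptions rather than values from this study:

```python
def c4_fraction(delta13c, d13c_c3=-34.0, d13c_c4=-21.0):
    """Fraction of C4-plant-derived carbon in a leaf-wax n-alkane from its
    measured delta13C (per mil), by linear two-endmember mixing. The C3
    and C4 endmember values here are illustrative assumptions."""
    return (delta13c - d13c_c3) / (d13c_c4 - d13c_c3)
```

A glacial shift of the measured delta13C toward the C4 endmember would thus translate directly into a larger inferred C4-plant (arid-vegetation) fraction.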

Relevance:

80.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high- resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
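The gradual-deformation proposal described in the second part can be sketched in a few lines (the representation as a flat list of field values is an illustrative simplification): two realizations of the same zero-mean, unit-variance Gaussian field combined with weights cos(theta) and sin(theta) preserve the field statistics for any theta.

```python
import math

def gradual_deformation(m_current, m_independent, theta):
    """Gradual-deformation MCMC proposal: combine the current model
    realization with an independent realization of the same zero-mean,
    unit-variance Gaussian field. Since cos(theta)**2 + sin(theta)**2 = 1,
    the variance is preserved for any theta; a small theta gives a gentle
    perturbation, theta = pi/2 an entirely independent draw."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * a + s * b for a, b in zip(m_current, m_independent)]
```

Tuning theta thus controls the perturbation strength directly, which is the flexibility the thesis contrasts with sequential geostatistical resampling.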

Relevance:

80.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
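The non-parametric link between collocated electrical and hydraulic conductivities can be illustrated with a one-dimensional kernel-regression stand-in. The study itself uses a multivariate kernel density function; the names, data, and Nadaraya-Watson form below are illustrative:

```python
import math

def kernel_conditional_mean(x0, xs, ys, bandwidth):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel:
    a minimal stand-in for the non-parametric kernel-density relation
    between collocated log electrical (x) and hydraulic (y)
    conductivities. Illustrative only."""
    weights = [math.exp(-0.5 * ((x0 - x) / bandwidth) ** 2) for x in xs]
    z = sum(weights)
    return sum(w * y for w, y in zip(weights, ys)) / z
```

Evaluated at each downscaled ERT value, such a relation maps the exhaustively sampled geophysical field into an estimate of the sparsely sampled hydraulic conductivity field.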

Relevance:

80.00%

Publisher:

Abstract:

Summary of Findings (PDF 9.4mb) Alongside the executive summary above, this report is further broken into 3 technical reports and an appendix, which are available below. Because of their size, Technical Reports 2 and 3 are available in low-resolution format and are also broken into 4-part higher resolution versions. Technical Report 1 features the findings of the Census of Traveller Population and a Quantitative Study of Health Status and Health Utilisation Technical Report 1: Health Survey Findings (PDF 10mb) Technical Report 2 reports on Demography and Vital Statistics including mortality and life expectancy data, an initial report of the Birth Cohort Study and a report on Travellers in Institutions. The Birth Cohort Study was a 1 year follow-up of all Traveller babies born on the island of Ireland between 14th October 2008 and 13th October 2009, with data collection up to 13th October 2010. Part D of Technical Report 2 is the Birth Cohort Study Follow Up and was published in September 2011. 
Technical Report 2 – Parts A, B & C (PDF 12mb) Demography & Vital Statistics: Part A of Technical Report 2 (PDF 5.3mb) The Birth Cohort Study: Part B of Technical Report 2 (PDF 9.6mb) Travellers in Institutions: Part C of Technical Report 2 (PDF 4.3mb) Technical Report 2 Bibliography – Parts A, B & C (PDF 2.7mb) The Birth Cohort Study Follow Up: Part D of Technical Report 2 (including bibliography) (PDF 7.1mb) Technical Report 3 reports on Consultative Studies including qualitative studies based on focus groups and semi-structured interviews with Travellers and key discussants, and a survey of Health Service Providers Technical Report 3 : Full Report (PDF 11.8mb) Qualitative Studies: Part A of Technical Report 3 (PDF 4.2mb) Health Service Provider Study: Part B of Technical Report 3 (PDF 5.4mb) Discussion & Recommendations: Part C of Technical Report 3 (PDF 3.1mb) Technical Report 3 Bibliography (PDF 2.6mb) Preamble Health Service Providers Questionnaire for the Republic of Ireland and Northern Ireland (PDF 75kb) Questionnaire for the Republic of Ireland (PDF 326kb) Questionnaire for Northern Ireland (PDF 140kb)