965 results for FINAL DATA RELEASE
Abstract:
There is an increasing demand for DNA analysis because of the sensitivity of the method and its ability to uniquely identify and distinguish individuals with a high degree of certainty. This demand has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples, where the evidence contains more than one contributor. Additional processing to separate the different cell types, needed to simplify the final data interpretation, further adds to the already cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler, which has a small footprint and is semi-automated. Typically less than 10% of the male DNA is recovered using the standard extraction protocol for rape kits; almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also performed to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual intervention, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.
Abstract:
An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open-ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon and almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6500 profiles have been acquired for each radiometric channel. Because these radiometric data are collected without operator control and regardless of meteorological conditions, specific, automatic data-processing protocols have to be developed. Here we present a data quality-control procedure aimed at verifying profile shapes and enabling near-real-time data distribution. The procedure is specifically designed to: 1) identify the main measurement issues (dark signal, atmospheric clouds, spikes, and wave-focusing occurrences); and 2) validate the final data with a hierarchy of tests to ensure their scientific usability. The procedure, adapted to each of the four radiometric channels, flags each profile in a way compliant with the data-management procedure used by the Argo program. The main perturbations in the light field are identified by the new protocols with good performance over the whole dataset, which highlights their potential applicability at the global scale. Finally, comparison with modeled surface irradiances allows us to assess the accuracy of the quality-controlled irradiance measurements and to identify any possible evolution over the float lifetime due to biofouling and instrumental drift.
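The per-point flagging logic described in this abstract (dark signal, spikes) can be sketched in a few lines. This is an illustrative reconstruction, not the procedure's actual implementation: the function names, the log-space running-median spike test, and all thresholds are assumptions made here for the example.

```python
import math

def running_median(values, window=5):
    """Median-smooth a series; used as the reference curve for spike detection."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sorted(values[lo:hi])[(hi - lo) // 2])
    return out

def qc_flags(irradiance, dark_threshold=1e-4, spike_factor=3.0):
    """Flag each point of a profile as 'good', 'dark', or 'spike'.

    Works in log space because irradiance decays roughly exponentially
    with depth, so a clean profile is close to linear in log10.
    """
    logs = [math.log10(max(v, 1e-12)) for v in irradiance]
    ref = running_median(logs)
    residuals = [abs(v - r) for v, r in zip(logs, ref)]
    # Robust scale estimate: the median absolute residual.
    scale = sorted(residuals)[len(residuals) // 2] or 1e-12
    flags = []
    for v, res in zip(irradiance, residuals):
        if v < dark_threshold:
            flags.append('dark')   # at or below the instrumental dark signal
        elif res > spike_factor * scale:
            flags.append('spike')  # deviates strongly from the local median
        else:
            flags.append('good')
    return flags
```

A real implementation would add the cloud and wave-focusing tests and map these labels onto the Argo flag scale; the sketch only shows the shape-checking idea.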
Abstract:
Production Planning and Control (PPC) systems have grown and changed because of developments in planning tools and models, as well as the use of computers and information systems in this area. Although much has been published in research journals, the practice of PPC lags behind and makes little use of this research. PPC practices in SMEs lag behind for many reasons, which need to be explored. This research examines the effect on firm performance of identified variables such as the forecasting, planning, and control methods adopted, the demographics of the key person, the standardization practices followed, and the effects of training, learning, and IT usage. A model and framework were developed based on the literature. The model was tested empirically on data collected through a questionnaire administered to selected respondents from Small and Medium Enterprises (SMEs) in India; the final data set included 382 responses. Hypotheses linking SME performance with the use of forecasting, planning, and control were formulated and tested. Exploratory factor analysis was used for data reduction and for identifying the factor structure. High- and low-performing firms were classified using a logistic regression model. A confirmatory factor analysis was used to study the structural relationship between firm performance and the dependent variables.
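To illustrate the classification step mentioned in the abstract, high- and low-performing firms can be separated by a logistic regression fitted to factor scores. This is a generic sketch under assumed inputs, not the study's actual model: the predictors, data, and training setup here are synthetic.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression by gradient descent on the log-loss.

    X: (n, k) matrix of predictors (e.g. factor scores from EFA).
    y: (n,) binary labels (1 = high-performing firm).
    Returns weights, with the intercept as the last component.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))    # predicted P(high performer)
        w -= lr * Xb.T @ (p - y) / len(y)    # gradient step on mean log-loss
    return w

def predict(X, w):
    """Classify firms: 1 if predicted probability >= 0.5, else 0."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)
```

In practice one would use a standard statistics package and report odds ratios and fit diagnostics; the sketch only shows the mechanics of the classifier.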
Abstract:
This work aims to understand and unify information on epidemiological modelling methods and how those methods relate to public policy addressing human health, specifically in the context of infectious disease prevention, pandemic planning, and health behaviour change. The thesis employs multiple qualitative and quantitative methods and is presented as a manuscript of several individual, data-driven projects combined in a narrative arc. The first chapter introduces the scope and complexity of this interdisciplinary undertaking, describing several important topical intersections. The second chapter begins the presentation of original data, describing in detail two exercises in computational epidemiological modelling pertinent to pandemic influenza planning and policy. The next chapter presents additional original data on how public confidence in modelling methodology may affect planned health behaviour change as recommended in public health policy. The final data-driven chapter describes how health policymakers use modelling methods and scientific evidence to inform and construct health policies for the prevention of infectious diseases, and the thesis concludes with a narrative chapter that evaluates the breadth of these data and recommends strategies for the optimal use of modelling methodologies in applied public health scenarios.
Abstract:
BACKGROUND Canine inflammatory bowel disease (IBD) is a chronic enteropathy of unknown etiology, although microbiome dysbiosis, genetic susceptibility, and dietary and/or environmental factors are hypothesized to be involved in its pathogenesis. Since some current therapies are associated with severe side effects, novel therapeutic modalities are needed. A new oral supplement for the long-term management of canine IBD, containing chondroitin sulfate (CS) and prebiotics (resistant starch, β-glucans and mannan-oligosaccharides), was developed to target intestinal inflammation and oxidative stress and to restore normobiosis, without exhibiting any side effects. This double-blinded, randomized, placebo-controlled trial in dogs with IBD aims to evaluate the effects of 180 days of administration of this supplement, together with a hydrolyzed diet, on clinical signs, intestinal histology, gut microbiota, and serum biomarkers of inflammation and oxidative stress. RESULTS Twenty-seven client-owned dogs with biopsy-confirmed IBD were included in the study, switched to the same hydrolyzed diet, and assigned to one of two groups: supplement or placebo. Initially, there were no significant differences between groups (p > 0.05) for any of the studied parameters. Final data analysis (supplement: n = 9; placebo: n = 10) showed a significant decrease in the canine IBD activity index (CIBDAI) score in both groups after treatment (p < 0.001). After treatment, a significant decrease (1.53-fold; p < 0.01) in histologic score was seen only in the supplement group. When the groups were compared, the supplement group showed significantly higher serum cholesterol (p < 0.05) and paraoxonase-1 (PON1) levels after 60 days of treatment (p < 0.01), and the placebo group showed significantly reduced serum total antioxidant capacity (TAC) levels after 120 days (p < 0.05). No significant differences were found between groups at any time point for CIBDAI, WSAVA histologic score, or fecal microbiota evaluated by PCR-restriction fragment length polymorphism (PCR-RFLP). No side effects were reported in either group. CONCLUSIONS The combined administration of the supplement with a hydrolyzed diet over 180 days was safe and induced improvements in selected serum biomarkers, possibly suggesting a reduction in disease activity. This study was likely underpowered; larger studies are therefore warranted to demonstrate an effect of this supplement, beyond dietary treatment, on intestinal histology and CIBDAI.
Abstract:
Turbulence introduced into the intra-cluster medium (ICM) by cluster merger events transfers energy to non-thermal components (relativistic particles and magnetic fields) and can trigger the formation of diffuse synchrotron radio sources. Owing to their steep synchrotron spectral index, such diffuse sources are better studied at low radio frequencies. In this respect, the LOw Frequency ARray (LOFAR) is revolutionizing our knowledge thanks to its unprecedented resolution and sensitivity below 200 MHz. In this Thesis we focus on the study of radio halos (RHs) using LOFAR data. In the first part of this work we analyzed the largest-ever sample of galaxy clusters observed at radio frequencies. This includes 309 Planck clusters from the Second Data Release of the LOFAR Two-metre Sky Survey (LoTSS-DR2), which span previously unexplored ranges of mass and redshift. We detected 83 RHs, half of which are new discoveries. For the 140 clusters without a detected RH, we developed new techniques to derive upper limits on their radio powers. By comparing detections and upper limits, we carried out the first statistical analysis of cluster populations observed at low frequencies and tested theoretical formation models. In the second part of this Thesis we focused on ultra-steep-spectrum radio halos. These sources are almost undetectable at GHz frequencies but are thought to be common at low frequencies. We presented LOFAR observations of two interesting clusters hosting ultra-steep-spectrum radio halos. With complementary radio and X-ray observations we constrained the properties and origin of these targets.
Abstract:
In this Thesis, we present a series of works that encompass the fundamental steps of cosmological analyses based on galaxy clusters, spanning from mass calibration to deriving cosmological constraints through counts and clustering. Firstly, we focus on the 3D two-point correlation function (2PCF) of the galaxy cluster sample by Planck Collaboration XXVII (2016). The masses of these clusters are expected to be underestimated, as they are derived from a scaling relation calibrated through X-ray observations. We derived a mass bias that disagrees with simulation predictions but is consistent with that derived by Planck Collaboration VI (2020). Furthermore, in this Thesis we analyse the cluster counts and the 2PCF of the photometric galaxy cluster sample developed by Maturi et al. (2019), based on the third data release of KiDS (KiDS-DR3, de Jong et al. 2017). We derived constraints on fundamental cosmological parameters which are consistent and competitive, in terms of uncertainties, with other state-of-the-art cosmological analyses. Then, we introduce a novel approach to establishing galaxy colour-redshift relations for cluster weak-lensing analyses, regardless of the specific photometric bands in use. This method optimises the completeness of the selection of cluster background galaxies while maintaining a defined purity threshold. Based on the galaxy sample by Bisigello et al. (2020), we calibrated two colour selections, one relying on the ground-based griz bands, and the other including the griz and Euclid YJH bands. In addition, we present preliminary work on the weak-lensing mass calibration of the clusters detected by Maturi et al. (in prep.) in the fourth data release of KiDS (KiDS-1000, Kuijken et al. 2019). This mass calibration will enable cosmological analyses based on cluster counts and clustering, from which we expect remarkable improvements in the results compared to those derived from KiDS-DR3.
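At its core, a 2PCF measurement like those above counts pairs of objects in separation bins for the data and for a random catalogue tracing the survey geometry. The toy sketch below uses the simple "natural" estimator DD/RR - 1; real cluster analyses use refinements such as the Landy-Szalay estimator with full selection functions, and the function names here are illustrative.

```python
import itertools
import math

def pair_counts(points, bins):
    """Count pairs of points per separation bin; bins are sorted edges."""
    counts = [0] * (len(bins) - 1)
    for p, q in itertools.combinations(points, 2):
        r = math.dist(p, q)
        for k in range(len(bins) - 1):
            if bins[k] <= r < bins[k + 1]:
                counts[k] += 1
                break
    return counts

def natural_xi(data, randoms, bins):
    """Natural estimator xi(r) = (DD/RR) * Nr(Nr-1)/(Nd(Nd-1)) - 1.

    data, randoms: lists of coordinate tuples; bins: separation bin edges.
    Bins with no random pairs are returned as 0.0 to avoid division by zero.
    """
    dd = pair_counts(data, bins)
    rr = pair_counts(randoms, bins)
    nd, nr = len(data), len(randoms)
    norm = (nr * (nr - 1)) / (nd * (nd - 1))  # account for catalogue sizes
    return [norm * d / r - 1 if r else 0.0 for d, r in zip(dd, rr)]
```

The brute-force double loop is O(n²); production codes use tree-based pair counters, but the estimator itself is the quantity being illustrated.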
Abstract:
We present a georeferenced photomosaic of the Lucky Strike hydrothermal vent field (Mid-Atlantic Ridge, 37°18'N). The photomosaic was generated from digital photographs acquired using the ARGO II seafloor imaging system during the 1996 LUSTRE cruise, which surveyed a ~1 km² zone and provided a coverage of ~20% of the seafloor. The photomosaic has a pixel resolution of 15 mm and encloses the areas with known active hydrothermal venting. The final mosaic is generated through an optimization that includes the automatic detection of the same benthic features across different images (feature matching), followed by a global alignment of images based on the vehicle navigation. We also provide software to construct mosaics from large sets of images for which georeferencing information exists (location, attitude, and altitude per image), to visualize them, and to extract data. Georeferencing information can be provided by the raw navigation data (collected during the survey) or result from the optimization obtained from image matching. Mosaics based solely on navigation can be readily generated by any user, but the optimization and global alignment of the mosaic require a case-by-case approach for which no universal software is available. The Lucky Strike photomosaics (optimized and navigation-only) are publicly available through the Marine Geoscience Data System (MGDS, http://www.marine-geo.org). The mosaic-generating and viewing software is available through the Computer Vision and Robotics Group Web page at the University of Girona (http://eia.udg.es/_rafa/mosaicviewer.html).
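A navigation-only mosaic of the kind mentioned above amounts to placing each image on a common canvas at the mosaic resolution (here, 15 mm per pixel). The following is a minimal sketch assuming per-image vehicle positions in a local metric frame; the function name is hypothetical, and the real pipeline also uses attitude and altitude and refines the placement by feature matching and global alignment.

```python
def to_mosaic_pixels(positions_m, resolution_m=0.015):
    """Convert image-center positions (metres, local east/north frame)
    into mosaic pixel coordinates at the given resolution.

    Image row indices grow downward, so the northing axis is inverted:
    the northernmost position maps to row 0.
    """
    xs = [p[0] for p in positions_m]
    ys = [p[1] for p in positions_m]
    x0, y1 = min(xs), max(ys)  # canvas origin: west-most, north-most point
    return [(round((x - x0) / resolution_m),   # column offset
             round((y1 - y) / resolution_m))   # row offset
            for x, y in positions_m]
```

Each image is then pasted (or blended) at its computed offset; the optimization step described in the abstract corrects the residual misalignments that raw navigation leaves between overlapping images.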
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies