929 results for Source analysis
Abstract:
Most studies that apply qualitative comparative analysis (QCA) rely on macro-level data, but an increasing number focus on units of analysis at the micro or meso level (i.e., households, firms, protected areas, communities, or local governments). For such studies, qualitative interview data are often the primary source of information. Yet no procedure has so far been available describing how to calibrate qualitative data as fuzzy sets. The authors propose a technique for doing so and illustrate it with examples from a study of Guatemalan local governments. By spelling out the details of this important analytic step, the authors aim to contribute to the growing literature on best practice in QCA.
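Ragin's direct calibration method is one widely used way to perform this step: raw scores are mapped to fuzzy membership through a logistic transformation anchored at thresholds for full non-membership, the crossover point, and full membership. The sketch below illustrates that general technique; the anchor values and 0-10 scale are hypothetical, and this is not the authors' interview-specific procedure.

import math

def calibrate(raw, full_non, crossover, full_mem):
    """Direct calibration: map a raw score to fuzzy membership in [0, 1].

    Log-odds are scaled so that the full-membership anchor maps to ~0.95
    (log-odds +3) and the full-non-membership anchor to ~0.05 (log-odds -3),
    as in Ragin's direct method.
    """
    if raw >= crossover:
        log_odds = 3.0 * (raw - crossover) / (full_mem - crossover)
    else:
        log_odds = -3.0 * (crossover - raw) / (crossover - full_non)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical interview-derived scores on a 0-10 scale.
for score in [1, 4, 5, 7, 9]:
    print(score, round(calibrate(score, full_non=2, crossover=5, full_mem=8), 3))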
Abstract:
Ongoing Cryptococcus gattii outbreaks in the Western United States and Canada illustrate the impact of environmental reservoirs and of both clonal and recombining propagation in driving the emergence and expansion of microbial pathogens. C. gattii comprises four distinct molecular types: VGI, VGII, VGIII, and VGIV, with no evidence of nuclear genetic exchange between them, indicating that these represent distinct species. C. gattii VGII isolates are causing the Pacific Northwest outbreak, whereas VGIII isolates frequently infect HIV/AIDS patients in Southern California. VGI, VGII, and VGIII have been isolated from patients and animals in the Western US, suggesting that these molecular types occur in the environment. However, only two environmental isolates of C. gattii have ever been reported from California: CBS7750 (VGII) and WM161 (VGIII). The incongruence of frequent clinical presence and uncommon environmental isolation suggests an unknown C. gattii reservoir in California. Here we report frequent isolation of C. gattii VGIII MATα and MATa isolates, and infrequent isolation of VGI MATα, from environmental sources in Southern California. VGIII isolates were obtained from soil debris associated with tree species not previously reported as hosts, from sites near the residences of infected patients. These isolates are fertile under laboratory conditions, produce abundant spores, and are part of both locally and more distantly recombining populations. MLST and whole-genome sequence analysis provide compelling evidence that these environmental isolates are the source of human infections. Isolates displayed wide-ranging virulence in macrophage and animal models. When clinical and environmental isolates with indistinguishable MLST profiles were compared, environmental isolates were less virulent. Taken together, our studies reveal an environmental source of C. gattii and a risk to HIV/AIDS patients, with implications for the >1,000,000 cryptococcal infections occurring annually, for which the causative isolate is rarely assigned species status. Thus, the global health burden of C. gattii could be more substantial than currently appreciated.
Abstract:
Early interventions are a preferred method for addressing behavioral problems in high-risk children, but often have only modest effects. Identifying sources of variation in intervention effects can suggest means to improve efficiency. One potential source of such variation is the genome. We conducted a genetic analysis of the Fast Track randomized controlled trial, a 10-year-long intervention to prevent high-risk kindergarteners from developing adult externalizing problems, including substance abuse and antisocial behavior. We tested whether variants of the glucocorticoid receptor gene NR3C1 were associated with differences in response to the Fast Track intervention. We found that in European-American children, a variant of NR3C1 identified by the single-nucleotide polymorphism rs10482672 was associated with increased risk for externalizing psychopathology in control-group children and decreased risk for externalizing psychopathology in intervention-group children. Variation in NR3C1 measured in this study was not associated with differential intervention response in African-American children. We discuss implications for efforts to prevent externalizing problems in high-risk children and for public policy in the genomic era.
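Analyses of this kind typically test a gene-by-treatment interaction term in a regression of the outcome on genotype, treatment assignment, and their product. The following is a minimal sketch of such a test on simulated data; the variable coding, names, and effect sizes are hypothetical, and this is not the Fast Track analysis itself.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),    # randomized assignment
    "genotype": rng.integers(0, 3, n),   # allele count (hypothetical coding)
})
# Simulate an externalizing score where the variant raises risk in controls
# and lowers it under intervention (a crossover interaction).
df["externalizing"] = (
    0.5 * df["genotype"] * (1 - df["treated"])
    - 0.3 * df["genotype"] * df["treated"]
    + rng.normal(0, 1, n)
)

# The interaction coefficient tests for differential intervention response.
model = smf.ols("externalizing ~ treated * genotype", data=df).fit()
print(model.summary().tables[1])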
Abstract:
The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes the size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm's shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density because it assumes only radial symmetry. This open-source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ, so it may be extended easily. We evaluated the technical performance of this framework and used it to analyze the growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild type upon hatching but grow slowly, and that WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
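Under the radial-symmetry assumption, volume can be estimated by integrating circular cross-sections along the worm's midline. The sketch below illustrates that calculation on a hypothetical midline profile; it is a schematic of the general approach, not WormSizer's actual Fiji/ImageJ implementation.

import numpy as np

def worm_volume(midline, radii):
    """Estimate volume assuming circular cross-sections along the midline.

    midline: (N, 2) array of (x, y) points along the worm's central axis.
    radii:   (N,) array of half-widths measured perpendicular to the axis.
    """
    midline = np.asarray(midline, float)
    radii = np.asarray(radii, float)
    seg_len = np.linalg.norm(np.diff(midline, axis=0), axis=1)   # ds per segment
    seg_area = np.pi * ((radii[:-1] + radii[1:]) / 2.0) ** 2     # mean cross-section
    return float(np.sum(seg_area * seg_len))

# Hypothetical midline and half-width profile in micrometres.
x = np.linspace(0, 1000, 50)
midline = np.column_stack([x, np.zeros_like(x)])
radii = 25 * np.sin(np.pi * x / 1000) + 5   # tapered at both ends
print(f"estimated volume: {worm_volume(midline, radii):.0f} um^3")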
Abstract:
MOTIVATION: Technological advances that allow routine identification of high-dimensional risk factors have led to high demand for statistical techniques that enable full utilization of these rich sources of information in genetics studies. Variable selection for censored outcome data, as well as control of false discoveries (i.e. inclusion of irrelevant variables) in the presence of high-dimensional predictors, presents serious challenges. This article develops a computationally feasible method based on boosting and stability selection. Specifically, we modified component-wise gradient boosting to improve computational feasibility and introduced random permutation in stability selection to control false discoveries. RESULTS: We have proposed a high-dimensional variable selection method that incorporates stability selection to control false discoveries. Comparisons between the proposed method and the commonly used univariate and Lasso approaches for variable selection reveal that the proposed method yields fewer false discoveries. The proposed method is applied to study the associations of 2339 common single-nucleotide polymorphisms (SNPs) with overall survival among cutaneous melanoma (CM) patients. The results confirm that BRCA2 pathway SNPs are likely to be associated with overall survival, as reported in previous literature. Moreover, we have identified several new Fanconi anemia (FA) pathway SNPs that are likely to modulate survival of CM patients. AVAILABILITY AND IMPLEMENTATION: The related source code and documents are freely available at https://sites.google.com/site/bestumich/issues. CONTACT: yili@umich.edu.
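The stability-selection skeleton underlying such a method is to refit a sparse learner on many random subsamples and keep only the variables selected in a high fraction of fits. A minimal sketch of that skeleton follows, using a lasso base learner on synthetic data as a stand-in; the authors' method instead uses component-wise gradient boosting for censored outcomes with random permutation, which this simplified sketch does not reproduce.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p, n_runs, threshold = 200, 50, 100, 0.7

X = rng.normal(size=(n, p))
y = X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)  # only variables 0 and 1 matter

counts = np.zeros(p)
for _ in range(n_runs):
    idx = rng.choice(n, size=n // 2, replace=False)   # random half-sample
    fit = Lasso(alpha=0.1).fit(X[idx], y[idx])
    counts += fit.coef_ != 0                          # record selected variables

stable = np.where(counts / n_runs >= threshold)[0]
print("stably selected variables:", stable)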
Abstract:
The outcomes of both (i) radiation therapy and (ii) preclinical small animal radiobiology studies depend on the delivery of a known quantity of radiation to a specific and intentional location. Adverse effects can result from these procedures if the dose to the target is too high or too low, and also from an incorrect spatial distribution in which nearby normal healthy tissue is undesirably damaged by poor radiation delivery techniques. Thus, in mice and humans alike, the spatial dose distributions from radiation sources should be well characterized in terms of the absolute dose quantity, and with pin-point accuracy. When dealing with the steep spatial dose gradients consequential to either (i) high dose rate (HDR) brachytherapy or (ii) the small organs and tissue inhomogeneities of mice, obtaining accurate and highly precise dose results can be very challenging, given that commercially available radiation detection tools, such as ion chambers, are often too large for in-vivo use.
In this dissertation two tools are developed and applied for both clinical and preclinical radiation measurement. The first tool is a novel radiation detector for acquiring physical measurements, fabricated from an inorganic nano-crystalline scintillator that has been fixed on an optical fiber terminus. This dosimeter allows for the measurement of point doses to sub-millimeter resolution, and has the ability to be placed in-vivo in humans and small animals. Real-time data is displayed to the user to provide instant quality assurance and dose-rate information. The second tool utilizes an open source Monte Carlo particle transport code, and was applied for small animal dosimetry studies to calculate organ doses and recommend new techniques of dose prescription in mice, as well as to characterize dose to the murine bone marrow compartment with micron-scale resolution.
Hardware design changes were implemented to reduce the overall fiber diameter to <0.9 mm for the nano-crystalline scintillator based fiber optic detector (NanoFOD) system. The lower limit of device sensitivity was found to be approximately 0.05 cGy/s. Herein, this detector was demonstrated to perform quality assurance of clinical 192Ir HDR brachytherapy procedures, providing dose measurements comparable to thermoluminescent dosimeters and accuracy within 20% of the treatment planning software (TPS) for the 27 treatments conducted, with an inter-quartile range of the NanoFOD-to-TPS dose ratio of 0.08 (0.94 to 1.02). After removing contaminant signals (Cerenkov and diode background), calibration of the detector enabled accurate dose measurements for vaginal applicator brachytherapy procedures. For 192Ir use, the energy response changed by a factor of 2.25 over SDD values of 3 to 9 cm; however, a 0.2 mm thick silver cap reduced the energy dependence to a factor of 1.25 over the same SDD range, at the cost of reducing overall sensitivity by 33%.
For preclinical measurements, the dose accuracy of the NanoFOD was within 1.3% of MOSFET-measured dose values in a cylindrical mouse phantom at 225 kV for x-ray irradiation at angles of 0, 90, 180, and 270°. The NanoFOD exhibited small changes in angular sensitivity, with a coefficient of variation (COV) of 3.6% at 120 kV and 1% at 225 kV. When the NanoFOD was placed alongside a MOSFET in the liver of a sacrificed mouse and treatment was delivered at 225 kV with a 0.3 mm Cu filter, the dose difference was only 1.09% with the 4x4 cm collimator and -0.03% with no collimation. Additionally, the NanoFOD utilized a scintillator of 11 µm thickness to measure small x-ray fields for microbeam radiation therapy (MRT) applications, and achieved 2.7% dose accuracy at the microbeam peak in comparison to radiochromic film. Modest differences in the measured full-width at half maximum lateral dimension of the MRT beam were observed between the NanoFOD (420 µm) and radiochromic film (320 µm), but these differences are explained mostly as an artifact of the geometry used and volumetric effects in the scintillator material. Characterization of the energy dependence of the yttrium-oxide based scintillator material was performed in the range of 40-320 kV (2 mm Al filtration), and maximum device sensitivity was achieved at 100 kV. Tissue maximum ratio measurements were carried out on a small animal x-ray irradiator system at 320 kV and demonstrated an average difference of 0.9% compared to a MOSFET dosimeter over the range of 2.5 to 33 cm depth in tissue-equivalent plastic blocks. Irradiation of the NanoFOD fiber and scintillator material on a 137Cs gamma irradiator to 1600 Gy did not produce any measurable change in light output, suggesting that the NanoFOD system may be re-used without replacement or recalibration over its lifetime.
For small animal irradiator systems, researchers deliver a given dose to a target organ by controlling exposure time. Currently, researchers calculate this exposure time by dividing the total dose they wish to deliver by a single provided dose-rate value; this method is independent of the target organ. Studies conducted here used Monte Carlo particle transport codes to justify a new method of dose prescription in mice that considers organ-specific doses. Monte Carlo simulations were performed in the Geant4 Application for Tomographic Emission (GATE) toolkit using a MOBY mouse whole-body phantom. The non-homogeneous phantom was composed of 256 x 256 x 800 voxels of size 0.145 x 0.145 x 0.145 mm³. Differences of up to 20-30% in dose to soft-tissue target organs were demonstrated during whole-body irradiation of mice, and methods for alleviating these errors by utilizing organ-specific and x-ray tube filter-specific dose rates for all irradiations were suggested.
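To make the arithmetic of the proposed prescription concrete, the sketch below contrasts the conventional single-dose-rate calculation with organ- and filter-specific dose rates. It is a minimal illustration with hypothetical dose-rate values, not output from the GATE simulations described above.

# Hypothetical organ- and filter-specific dose rates (cGy/s), as would be
# derived from Monte Carlo simulation; the single "machine" rate is what
# researchers conventionally use for every organ.
dose_rates = {
    ("liver", "0.3mm_Cu"): 0.95,
    ("lung",  "0.3mm_Cu"): 1.10,
    ("machine", None):     1.00,
}

def exposure_time(target_dose_cgy, organ, filt):
    """Exposure time (s) needed to deliver target_dose_cgy to an organ."""
    return target_dose_cgy / dose_rates[(organ, filt)]

target = 400.0  # cGy
naive = target / dose_rates[("machine", None)]
for organ in ("liver", "lung"):
    t = exposure_time(target, organ, "0.3mm_Cu")
    print(f"{organ}: organ-specific time {t:.1f} s vs machine-rate time {naive:.1f} s")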
Monte Carlo analysis was used on 1 µm resolution CT images of a mouse femur and a mouse vertebra to calculate the dose gradients within the bone marrow (BM) compartment of mice for different radiation beam qualities relevant to x-ray and isotope-type irradiators. Results indicated that soft x-ray beams (160 kV at 0.62 mm Cu HVL and 320 kV at 1 mm Cu HVL) lead to substantially higher dose to BM in close proximity to mineral bone (within about 60 µm) compared to hard x-ray beams (320 kV at 4 mm Cu HVL) and isotope-based gamma irradiators (137Cs). The average dose increases to the BM in the vertebra for these four radiation beam qualities were found to be 31%, 17%, 8%, and 1%, respectively. Both in-vitro and in-vivo experimental studies confirmed these simulation results, demonstrating that the 320 kV, 1 mm Cu HVL beam caused statistically significant increased killing of BM cells at 6 Gy dose levels in comparison to both the 320 kV, 4 mm Cu HVL and the 662 keV 137Cs beams.
Abstract:
The Aircraft Accident Statistics and Knowledge (AASK) database is a repository of survivor accounts from aviation accidents. Its main purpose is to store observational and anecdotal data from interviews of the occupants involved in aircraft accidents. The database has wide application in aviation safety analysis as a source of factual data regarding the evacuation process. It is also key to the development of aircraft evacuation models such as airEXODUS, where insight into how people actually behave during evacuation from survivable aircraft crashes is required. This paper describes recent developments with the database leading to AASK v3.0: a significant increase in the number of passenger accounts, the introduction of cabin crew accounts and of fatality information, improved functionality through the seat plan viewer utility, and improved ease of access to the database via the internet. In addition, the paper demonstrates the use of the database by investigating a number of important issues associated with aircraft evacuation: social bonding and evacuation, the relationship between the number of crew and evacuation efficiency, the frequency of exit/slide failures in accidents, and possible relationships between seating location and chances of survival. Finally, the passenger behavioural trends described in analyses undertaken with the earlier database are confirmed with the wider data set.
Abstract:
Problems in preserving the quality of granular material products are complex and arise from a series of sources during transport and storage. Whether designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, practical measurement and simulation tools and technologies are required to support the process engineer, both in identifying the source of such problems and in designing them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these computational models of unit processes in the analysis of large-scale processes involving the handling of granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of the materials from a silo, their transport through a dilute-phase pneumatic conveyor, and their storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture migration caking.
Abstract:
We explore the potential application of a cognitive interrogator network (CIN) to the remote monitoring of mobile subjects in domestic environments, where the ultra-wideband radio frequency identification (UWB-RFID) technique is considered for accurate source localization. We first present the CIN architecture, in which the central base station (BS) continuously and intelligently customizes the illumination modes of the distributed transceivers in response to the system's changing knowledge of the channel conditions and subject movements. Subsequently, analytical results for the locating probability and time-of-arrival (TOA) estimation uncertainty of a large-scale CIN with randomly distributed interrogators are derived based upon the implemented cognitive intelligence. Finally, numerical examples are used to demonstrate the key effects of the proposed cognitive mechanisms on system performance.
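TOA-based localization turns arrival-time estimates at distributed interrogators into range measurements and solves for the source position, for example by linearized least squares. The sketch below illustrates that standard computation with hypothetical interrogator positions; it is not the paper's analytical derivation of locating probability or TOA uncertainty.

import numpy as np

C = 3e8  # speed of light (m/s)

def toa_localize(anchors, toas):
    """Least-squares source position from TOA measurements at known anchors.

    Linearizes the range equations ||x - a_i|| = c * t_i by subtracting the
    first anchor's equation, then solves the resulting linear system.
    """
    anchors = np.asarray(anchors, float)
    r = C * np.asarray(toas, float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical room-scale deployment: four interrogators, one UWB tag.
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
source = np.array([3.0, 7.0])
toas = [np.linalg.norm(source - np.array(a)) / C for a in anchors]
print(toa_localize(anchors, toas))   # ~ [3. 7.]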
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel, and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but the logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the archived material and the procedure and cost of undertaking further work. The results of the scoping study are:
1. NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths, up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters.
2. The reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted 'long-life' medium (Betamax) would be approximately £15,000. It was not possible to view the tapes, as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to become beyond salvation within one to two years.
3. The video cassette material is in good condition and is expected to remain so for several more years at least. The images viewed were generally of poor quality, and the speed of tow often makes the pictures blurred. No immediate action is required.
4. The colour transparency films are in good condition and the images are very clear. They provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic because there are no between-frame breaks between images and scanning machines need to centre each image based on between-frame breaks. The minimum cost of scanning all of the images commercially is approximately £6,000, and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image; all in all, it would seem most economic to purchase a 'continuous film' scanner and undertake the work in-house.
5. Positional information in the ships' logs has been matched to films and video tapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds), and a further routine was developed to convert these to decimal degrees as required for GIS mapping (a sketch of such a conversion follows this list). However, it is unclear whether corrections to Decca positions were applied at the time each position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart.
6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed on GIS or on a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT.
7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least in the order of 200 hours and is not recommended.
8. Once the colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended.
9. Identifying biotopes along approximately 1 km lengths of each tow is expected to be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and resolve the Decca correction question would improve the accuracy of image locations.
10. Using the codings produced by Holme to identify different seabed types, and some viewing of video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis.
11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall 'survey' will be 'English Channel towed video sled survey'. The 'events' become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle, to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates, and the four samples would link to a GIS map.
12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level-seabed types found along part of the northern English Channel. More recent images, taken during SCUBA diving of reef habitats in the same area as the towed sledge surveys, could be added to the Holme images.
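The degrees/minutes/seconds to decimal-degrees step mentioned in point 5 is simple arithmetic. A minimal sketch follows, assuming positions are already expressed in latitude and longitude; the coordinates shown are hypothetical, and the Decca-to-lat/long conversion and any Decca corrections are a separate, chain-specific problem.

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees for GIS use."""
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    return -dd if hemisphere in ("S", "W") else dd

# Hypothetical tow start position in the English Channel.
lat = dms_to_decimal(50, 12, 30, "N")
lon = dms_to_decimal(3, 45, 10, "W")
print(f"{lat:.6f}, {lon:.6f}")   # 50.208333, -3.752778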
Abstract:
Coccolithophores are the largest source of calcium carbonate in the oceans and are considered to play an important role in oceanic carbon cycles. Current methods to detect the presence of coccolithophore blooms from Earth observation data often produce high numbers of false positives in shelf seas and coastal zones due to the spectral similarity between coccolithophores and other suspended particulates, and are therefore unable to characterise bloom events in these waters, despite the importance of these phytoplankton in the global carbon cycle. A novel approach to detecting the presence of coccolithophore blooms from Earth observation data is presented. The method builds upon previous optical work and uses a statistical framework to combine spectral, spatial and temporal information to produce maps of coccolithophore bloom extent. Validation and verification results for an area of the northeast Atlantic are presented using an in situ database (N = 432) and all available SeaWiFS data for 2003 and 2004. Verification shows that the approach produces a seasonal signal consistent with biological studies of these phytoplankton. Validation against the in situ coccolithophore cell count database shows a high correct recognition rate of 80% and a low false-positive rate of 0.14 (in comparison to 63% and 0.34 respectively for the established, purely spectral approach). To guide its broader use, a full sensitivity analysis of the algorithm parameters is presented.
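The validation statistics quoted above follow directly from a confusion matrix of bloom detections against in situ observations. A minimal sketch of that bookkeeping, with hypothetical match-up counts rather than the paper's N = 432 database:

def detection_rates(tp, fn, fp, tn):
    """Correct-recognition rate and false-positive rate from a confusion matrix."""
    recognition = tp / (tp + fn)        # fraction of true blooms detected
    false_positive = fp / (fp + tn)     # fraction of non-blooms flagged as blooms
    return recognition, false_positive

# Hypothetical match-up counts against an in situ database.
rec, fpr = detection_rates(tp=80, fn=20, fp=14, tn=86)
print(f"recognition rate: {rec:.0%}, false-positive rate: {fpr:.2f}")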
Abstract:
Carbon and nitrogen stable isotope ratios of amino acids (δ13C-AA and δ15N-AA) have recently been used to unravel trophic relationships in aquatic and terrestrial environments, but they have not previously been applied to the specific case of a symbiotic relationship. Here we use the stable isotope ratios of amino acids (AAs) to investigate the link between a scarab larva (Pericoptus truncatus) and its mite guest (Mumulaelaps, Mesostigmata: Laelapidae: Hypoaspidini). Five scenarios for the relationship between larva and mite were proposed, and the δ13C-AA and δ15N-AA data and patterns helped eliminate those that were inconsistent. A calculated gap of two trophic levels ruled out a parasitic trophic relationship. The relationship between P. truncatus and the mite was shown to most likely be commensal, with the mites feeding on the larva's castings. Alongside this study, a comparison with the bulk stable isotope analysis method demonstrated that the AA method brings a significant refinement to the results by providing a means of determining absolute trophic level without the need for prior knowledge of the isotopic composition of the primary source material.
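A common route to an absolute trophic position from δ15N-AA is to contrast a 'trophic' amino acid (glutamic acid) with a 'source' amino acid (phenylalanine). The sketch below uses the widely cited Chikaraishi-style formula; the β and trophic discrimination factor values are literature constants for one class of system, assumed here for illustration, and are not values from this study.

def trophic_position(d15n_glu, d15n_phe, beta=3.4, tdf=7.6):
    """Absolute trophic position from glutamic acid vs. phenylalanine d15N.

    beta: d15N_Glu - d15N_Phe offset in primary producers (assumed, per mil).
    tdf:  trophic discrimination factor per trophic step (assumed, per mil).
    """
    return 1.0 + (d15n_glu - d15n_phe - beta) / tdf

# Hypothetical measurements (per mil): a two-step Glu-Phe spread beyond beta
# (2 * 7.6 = 15.2 per mil) corresponds to a gap of two trophic levels.
print(trophic_position(d15n_glu=21.0, d15n_phe=2.4))  # ~3.0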
Abstract:
Aims/Hypothesis: To describe the epidemiology of childhood-onset Type 1 (insulin-dependent) diabetes in Europe, the EURODIAB collaborative group established prospective, geographically-defined registers of children diagnosed under 15 years of age. A total of 16,362 cases were registered by 44 centres during the period 1989-1994. The registers cover a population of approximately 28 million children, with most European countries represented. Methods: In most centres a primary and a secondary source of ascertainment were used, so that the completeness of registration could be assessed by the capture-recapture method. Ecological correlation and regression analyses were used to study the relationship between incidence and various environmental, health and economic indicators. Findings: The standardised average annual incidence rate during the period 1989-1994 ranged from 3.2 cases per 100,000 per annum in the Former Yugoslav Republic of Macedonia to 40.2 cases per 100,000 per annum in Finland. Indicators of national prosperity such as infant mortality (r = -0.64) and gross domestic product (r = 0.58) were most strongly and significantly correlated with incidence rate, and previously reported associations with coffee consumption (r = 0.51), milk consumption (r = 0.58) and latitude (r = 0.40) were also observed. Conclusion/Interpretation: The wide variation in childhood Type 1 diabetes incidence rates within Europe can be partially explained by indicators of national prosperity. These indicators could reflect differences in environmental risk factors, such as nutrition or lifestyle, that are important in determining a country's incidence rate.
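With a primary and a secondary source, completeness can be assessed with the two-sample (Chandra Sekar-Deming) capture-recapture estimator, which infers the total case count from the overlap between sources. A minimal sketch with hypothetical counts for a single centre:

def capture_recapture(n1, n2, m):
    """Two-source capture-recapture (Chandra Sekar-Deming) estimate.

    n1, n2: cases found by the primary and secondary sources.
    m:      cases found by both sources.
    """
    n_total = n1 * n2 / m                   # estimated true number of cases
    completeness = (n1 + n2 - m) / n_total  # ascertainment of the combined register
    return n_total, completeness

# Hypothetical centre: 190 cases via the primary source, 120 via the
# secondary source, 110 appearing in both.
total, comp = capture_recapture(n1=190, n2=120, m=110)
print(f"estimated cases: {total:.0f}, completeness: {comp:.0%}")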
Abstract:
Damping torque analysis is a well-developed technique for understanding and studying power system oscillations. This paper presents two example applications of damping torque analysis to DC-bus-implemented damping control in power transmission networks. The first example investigates the damping effect of shunt VSC (Voltage Source Converter) based FACTS voltage control, i.e., STATCOM (Static Synchronous Compensator) voltage control. It is shown that STATCOM voltage control mainly contributes synchronous torque and hence has little effect on the damping of power system oscillations. The second example is damping control implemented by a Battery Energy Storage System (BESS) installed in a power system. Damping torque analysis reveals that BESS damping control exhibits different properties depending on whether it is realized by regulating the exchange of active or of reactive power between the BESS and the power system. It is concluded from damping torque analysis that BESS damping control implemented by regulating active power is preferable, with less interaction with BESS voltage control and more robustness to variations in power system operating conditions. All analytical conclusions are demonstrated by simulation results of example power systems.
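Damping torque analysis rests on decomposing the electric torque deviation as ΔTe ≈ Ks·Δδ + Kd·Δω, where the synchronous component Ks·Δδ is in phase with the rotor angle deviation and the damping component Kd·Δω is in phase with the rotor speed deviation. The sketch below recovers Ks and Kd by least squares from simulated signals; it is a numerical illustration of the decomposition, not the paper's analytical derivation, and the coefficient values are hypothetical.

import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 2000)
f = 1.0                                   # oscillation frequency (Hz), hypothetical

delta = np.sin(2 * np.pi * f * t)         # rotor angle deviation
omega = np.gradient(delta, t)             # rotor speed deviation
Ks_true, Kd_true = 1.2, 0.15              # assumed torque coefficients
te = Ks_true * delta + Kd_true * omega + 0.01 * rng.normal(size=t.size)

# Least-squares fit of Te = Ks*delta + Kd*omega recovers the coefficients;
# Kd > 0 indicates positive damping contributed by the controller.
A = np.column_stack([delta, omega])
(Ks, Kd), *_ = np.linalg.lstsq(A, te, rcond=None)
print(f"Ks = {Ks:.3f}, Kd = {Kd:.3f}")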
Abstract:
In this letter, a standard postnonlinear blind source separation algorithm is proposed, based on the MISEP method, which is widely used in linear and nonlinear independent component analysis. To best suit a wide class of postnonlinear mixtures, we adapt the MISEP method to incorporate a priori information about the mixtures. In particular, a group of three-layered perceptrons and a linear network are used as the unmixing system to separate sources in the postnonlinear mixtures, and another group of three-layered perceptrons is used as the auxiliary network. The learning algorithm for the unmixing system is then obtained by maximizing the output entropy of the auxiliary network. The proposed method is applied to postnonlinear blind source separation of both simulated signals and real speech signals, and the experimental results demonstrate its effectiveness and efficiency in comparison with existing methods.
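In the postnonlinear model, sources are mixed linearly and each mixture channel is then distorted by an invertible componentwise nonlinearity, x = f(As); the unmixing system mirrors this structure with per-channel nonlinear networks followed by a linear stage. The sketch below constructs such a mixture and inverts it with the known nonlinearity so the model structure is concrete; it does not implement MISEP's entropy-maximization training, which learns the inverse nonlinearities and unmixing matrix from data.

import numpy as np

rng = np.random.default_rng(3)
n_samples = 5000

# Two independent sources: a sine wave and uniform noise.
s = np.vstack([
    np.sin(np.linspace(0, 60, n_samples)),
    rng.uniform(-1, 1, n_samples),
])

A = np.array([[1.0, 0.6],
              [0.5, 1.0]])                 # linear mixing stage

z = A @ s                                  # linear mixtures
x = np.tanh(z)                             # invertible componentwise nonlinearity

# A postnonlinear separator applies, per channel, a flexible inverse
# nonlinearity (here the exact inverse, arctanh) and then a linear unmixing
# matrix -- the roles played by the perceptrons and linear network in MISEP.
z_hat = np.arctanh(np.clip(x, -0.999999, 0.999999))
s_hat = np.linalg.inv(A) @ z_hat
print("max reconstruction error:", np.abs(s_hat - s).max())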