899 results for Speed and torque observers


Relevance: 100.00%

Abstract:

The aims of this thesis were to investigate the neuropsychological, neurophysiological, and cognitive contributors to mobility changes with increasing age. In a series of studies with adults aged 45-88 years, unsafe pedestrian behaviour and falls were investigated in relation to i) cognitive functions (including response time variability, executive function, and visual attention tests), ii) mobility assessments (including gait and balance, using motion capture cameras), iii) motor initiation and pedestrian road crossing behaviour (using a simulated pedestrian road scene), iv) neuronal and functional brain changes (using a computer-based crossing task with magnetoencephalography), and v) quality of life questionnaires (including fear of falling and restricted range of travel). Older adults are more likely to be fatally injured at the far-side of the road compared to the near-side; however, the underlying mobility and cognitive processes related to lane-specific (i.e. near-side or far-side) pedestrian crossing errors in older adults are currently unknown.

The first study explored cognitive, motor initiation, and mobility predictors of unsafe pedestrian crossing behaviours. The purpose of the first study (Chapter 2) was to determine whether collisions at the near-side and far-side would be differentially predicted by mobility indices (such as walking speed and postural sway), motor initiation, and cognitive function (including spatial planning, visual attention, and within-participant variability) with increasing age. The results suggest that near-side unsafe pedestrian crossing errors are related to processing speed, whereas far-side errors are related to spatial planning difficulties. Both near-side and far-side crossing errors were related to walking speed and motor initiation measures (specifically motor initiation variability).

The salient mobility predictors of unsafe pedestrian crossings identified in the above study were examined in Chapter 3 in conjunction with the presence of a history of falls. The purpose of this study was to determine the extent to which walking speed (indicated as a salient predictor of unsafe crossings and start-up delay in Chapter 2) and previous falls can be predicted and explained by age-related changes in mobility and cognitive function (specifically within-participant variability and spatial ability). Self-rated mobility score, sit-to-stand time, motor initiation, and within-participant variability were found to predict 53.2% of walking speed variance. Although a significant model was not found to predict fall history variance, postural sway and attentional set shifting ability were found to be strongly related to the occurrence of falls within the last year.

Next, in Chapter 4, unsafe pedestrian crossing behaviour and the pedestrian predictors (both mobility and cognitive measures) from Chapter 2 were explored in terms of increasing hemispheric laterality of attentional functions and inter-hemispheric oscillatory beta power changes associated with increasing age. Elevated beta (15-35 Hz) power in the motor cortex prior to movement and reduced beta power post-movement have been linked to age-related changes in mobility. In addition, increasing recruitment of both hemispheres has been shown to occur with age and to help older adults perform similarly to younger adults in cognitive tasks (Cabeza, Anderson, Locantore, & McIntosh, 2002). It has been hypothesised that changes in hemispheric neural beta power may explain the presence of more pedestrian errors at the far-side of the road in older adults. The purpose of this study was to determine whether age-related changes in cortical oscillatory beta power and hemispheric laterality are linked to unsafe pedestrian behaviour in older adults. Results indicated that pedestrian errors at the near-side are linked to hemispheric bilateralisation and neural overcompensation post-movement, whereas far-side unsafe errors are linked to not employing neural compensation methods (hemispheric bilateralisation).

Finally, in Chapter 5, fear of falling, life space mobility, and quality of life in old age were examined to determine their relationships with cognition, mobility (including fall history and pedestrian behaviour), and motor initiation. In addition to death and injury, mobility decline (such as pedestrian errors in Chapter 2, and falls in Chapter 3) and cognition can negatively affect quality of life and result in activity avoidance. Further, the number of falls in Chapter 3 was not significantly linked to mobility and cognition alone and may be further explained by a fear of falling. The objective of this study was to determine the role of mobility and cognition in fear of falling and life space mobility, and their impact on quality of life measures. Results indicated that missing safe pedestrian crossing gaps (potentially indicating crossing anxiety) and mobility decline were consistent predictors of fear of falling, reduced life space mobility, and quality of life. Social community (total number of close family and friends) was also linked to life space mobility and quality of life. Lower cognitive functions (particularly processing speed and reaction time) were found to predict variance in fear of falling and quality of life in old age.

Overall, the findings indicated that mobility decline (particularly walking speed or walking difficulty), processing speed, and intra-individual variability in attention (including motor initiation variability) are salient predictors of participant safety (mainly pedestrian crossing errors) and wellbeing with increasing age. More research is required to produce a significant model to explain the number of falls.

Relevance: 100.00%

Abstract:

As long as governmental institutions have existed, efforts have been undertaken to reform them. This research examines a particular strategy, coercive controls, exercised through a particular instrument, executive orders, by a singular reformer, the president of the United States. The presidents studied (Johnson, Nixon, Ford, Carter, Reagan, Bush, and Clinton) are those whose campaigns for office were characterized to varying degrees as against Washington bureaucracy and for executive reform. Executive order issuance is assessed through an examination of key factors for each president, including political party affiliation, levels of political capital, and legislative experience. A classification typology is used to identify the topical dimensions and levels of coerciveness. The portrayal of the federal government is analyzed through examination of public, media, and presidential attention. The results show that executive orders are significant management tools for the president. Executive orders also represent an important component of the transition plans for incoming administrations. The findings indicate that, overall, while executive orders have not increased in the aggregate, they have become more intrusive and significant. Examination of political party affiliation, political capital, and legislative experience reveals a strong relationship between executive orders and previous executive experience, specifically for presidents who served as a state governor prior to winning national election as president. Presidents Carter, Reagan, and Clinton (all former governors) have the highest percentage of executive orders focusing on the federal bureaucracy. Additionally, the highest percentage of forceful orders was issued by former governors (41.0%), compared with presidents who had not served as governors (19.9%). Secondly, political party affiliation is an important, but not significant, predictor of the use of executive orders. Thirdly, management strategies that provide the president with the greatest level of autonomy, namely executive orders, redefine the concept of presidential power and autonomous action. Interviews with elite government officials and political observers support the idea that executive orders can provide the president with a successful management strategy, requiring less expenditure of political resources, posing less risk to political capital, and offering a way of achieving objectives without depending on an unresponsive Congress.

Relevance: 100.00%

Abstract:

With the recent explosion in the complexity and amount of digital multimedia data, there has been a huge impact on the operations of various organizations in distinct areas, such as government services, education, medical care, business, and entertainment. To satisfy the growing demand for multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications to offer a full scope of multimedia-related tools and provide appealing experiences for users. This research mainly focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called Hierarchical Markov Model Mediator (HMMM) is proposed to model high-dimensional media data, including video objects, low-level visual/audio features, and historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only general queries but also complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated so that user interests are mined efficiently to improve retrieval performance. Third, video clustering techniques are proposed to continuously increase searching speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to address the perception subjectivity issue by considering each user's profile. Moreover, to deal with security and privacy concerns in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control. SMARXO efficiently combines role-based access control (RBAC), XML, and an object-relational database management system (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed with the proposed framework DIMUSE to support multimedia capturing, analysis, retrieval, authoring, and presentation in a single framework.
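
The HMMM itself is specific to this dissertation, but the general idea it builds on, biasing retrieval scores with historical access patterns, can be illustrated with a minimal sketch. The weighting scheme, the `alpha` parameter, and the `access_counts` input below are illustrative assumptions, not the dissertation's actual formulation.

```python
import numpy as np

def rank_videos(query_vec, feature_matrix, access_counts, alpha=0.7):
    """Rank videos by a weighted mix of low-level feature similarity and
    historical access frequency (illustrative sketch, not the actual HMMM).

    query_vec      : (d,) query feature vector
    feature_matrix : (n, d) per-video feature vectors
    access_counts  : (n,) how often each video was retrieved in the past
    alpha          : weight given to content similarity vs. popularity
    """
    # Cosine similarity between the query and every video's features
    q = query_vec / (np.linalg.norm(query_vec) + 1e-12)
    f = feature_matrix / (np.linalg.norm(feature_matrix, axis=1, keepdims=True) + 1e-12)
    similarity = f @ q

    # Normalise historical access frequencies to [0, 1]
    popularity = access_counts / (access_counts.max() + 1e-12)

    score = alpha * similarity + (1.0 - alpha) * popularity
    return np.argsort(score)[::-1]          # indices of videos, best first

# Toy usage: 4 videos described by 3-dimensional feature vectors
ranking = rank_videos(np.array([1.0, 0.2, 0.0]),
                      np.random.rand(4, 3),
                      np.array([12, 3, 40, 7]))
print(ranking)
```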

Relevance: 100.00%

Abstract:

Deception research has traditionally focused on three methods of identifying liars and truth tellers: observing non-verbal or behavioral cues, analyzing verbal cues, and monitoring changes in physiological arousal during polygraph tests. Research shows that observers are often incapable of discriminating between liars and truth tellers with better than chance accuracy when they use these methods. One possible explanation for observers' poor performance is that they are not properly applying existing lie detection methods. An alternative explanation is that the cues on which these methods, and observers' judgments, are based do not reliably discriminate between liars and truth tellers. It may be possible to identify more reliable cues, and potentially improve observers' ability to discriminate, by developing a better understanding of how liars and truth tellers try to tell a convincing story.

This research examined (a) the verbal strategies used by truthful and deceptive individuals during interviews concerning an assigned activity, and (b) observers' ability to discriminate between them based on their verbal strategies. In Experiment I, pre-interview instructions manipulated participants' expectations regarding verifiability; each participant was led to believe that the interviewer could check some types of details, but not others, before deciding whether the participant was being truthful or deceptive. Interviews were then transcribed and scored for quantity and type of information provided. In Experiment II, observers listened to a random sample of the Experiment I interviews and rendered veracity judgments; half of the observers were instructed to judge the interviews according to the verbal strategies used by liars and truth tellers, and the other half were uninstructed.

Results of Experiment I indicate that liars and truth tellers use different verbal strategies, characterized by a differential amount of detail. Overall, truthful participants provided more information than deceptive participants. This effect was moderated by participants' expectations regarding verifiability, such that truthful participants provided more information only with regard to verifiable details. Results of Experiment II indicate that observers instructed about liars' and truth tellers' verbal strategies identify them with greater accuracy than uninstructed observers.

Relevance: 100.00%

Abstract:

The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the amount of data that needs to be processed. With expanding resolutions and evolving compression standards, there is a need for high-performance, flexible architectures that allow for quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, but with trade-offs among processing performance (achieving specified frame rates on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This dissertation presents dedicated hardware implementations of the pixel-rate and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance.

The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting occlusion occurrence during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analyzed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient-threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gesture sets involved in different applications may vary, so it is highly essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
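
The RAMT approach above is realized in dedicated FPGA hardware; as a rough software illustration of the general idea (a running-average background model with a global, automatically adapting threshold), a minimal sketch is given below. The update rule, the learning rate `bg_alpha`, and the multiplier `k` are assumptions for illustration, not the dissertation's design.

```python
import numpy as np

def detect_targets(frames, bg_alpha=0.05, k=2.5):
    """Background subtraction with a running-average background and a global
    threshold derived from the mean absolute difference (software sketch of
    the general RAMT idea, not the FPGA design).

    frames   : iterable of grayscale frames as 2-D arrays
    bg_alpha : running-average learning rate for the background model
    k        : multiplier on the mean difference used as the global threshold
    """
    frames = iter(frames)
    background = next(frames).astype(np.float32)
    masks = []
    for frame in frames:
        diff = np.abs(frame.astype(np.float32) - background)
        threshold = k * diff.mean()              # global, auto-adapting threshold
        masks.append(diff > threshold)           # foreground (target) mask
        background = (1 - bg_alpha) * background + bg_alpha * frame  # update model
    return masks

# Toy usage: a static scene with a bright "target" appearing in the last frame
f0 = np.zeros((64, 64), np.float32)
f1 = f0.copy(); f1[20:30, 20:30] = 200.0
print(detect_targets([f0, f0, f1])[-1].sum())    # number of foreground pixels
```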

Relevance: 100.00%

Abstract:

Glycogen Synthase Kinase 3 (GSK3), a serine/threonine kinase initially characterized in the context of glycogen metabolism, has repeatedly been shown to be a multitasking protein that regulates numerous cellular events in both metazoa and protozoa. I recently found that GSK3 plays a role in regulating chemotaxis, a guided cell movement in response to an external chemical gradient, in one of the best-studied model systems for chemotaxis, Dictyostelium discoideum.

It was initially found that, compared to wild type cells, gsk3- cells showed aberrant chemotaxis with a significant decrease in both speed and chemotactic indices. In Dictyostelium, phosphatidylinositol 3,4,5-trisphosphate (PIP3) signaling is one of the best characterized pathways regulating chemotaxis. Molecular analysis uncovered that gsk3- cells suffer from a high basal level of PIP3, the product of PI3K. Upon stimulation with the chemoattractant cAMP, wild type cells displayed a transient increase in the level of PIP3. In contrast, gsk3- cells exhibited neither a significant increase nor adaptation. On the other hand, no aberrant dynamics of phosphatase and tensin homolog (PTEN), which antagonizes PI3K function, were observed. Upon membrane localization, PI3K becomes activated by Ras, which in turn further facilitates membrane localization of PI3K in an F-actin-dependent manner. gsk3- cells treated with the F-actin inhibitor Latrunculin A showed no significant difference in PIP3 level.

I also showed that GSK3 affected the phosphorylation level of the localization domain of PI3K1 (PI3K1-LD). PI3K1-LD proteins from gsk3- cells displayed less phosphorylation on serine residues compared to those from wild type cells. When the potential GSK3 phosphorylation sites of PI3K1-LD were substituted with aspartic acid (phosphomimetic substitution), its membrane localization was suppressed in gsk3- cells. When these serine residues of PI3K1-LD were substituted with alanine, an aberrantly high level of membrane localization of PI3K1-LD was observed in wild type cells. Wild type, phosphomimetic, and alanine-substituted PI3K1-LD fused to GFP also displayed the same localization behavior, as suggested by cell fractionation studies. Lastly, I identified that all three potential GSK3 phosphorylation sites on PI3K1-LD could be phosphorylated in vitro by GSK3.

Relevance: 100.00%

Abstract:

The need for efficient, sustainable, and planned utilization of resources is ever more critical. In the U.S. alone, buildings consume 34.8 quadrillion (10^15) BTU of energy annually, at a cost of $1.4 trillion. Of this energy, 58% is used for heating and air conditioning.

Several building energy analysis tools have been developed to assess energy demands and lifecycle energy costs in buildings. Such analyses are also essential for an efficient HVAC design that avoids the pitfalls of an under- or over-designed system. DOE-2 is among the most widely known full building energy analysis models. It also constitutes the simulation engine of other prominent software such as eQUEST, EnergyPro, and PowerDOE. It is therefore essential that DOE-2 energy simulations be characterized by high accuracy.

Infiltration is an uncontrolled process through which outside air leaks into a building. Studies have estimated infiltration to account for up to 50% of a building's energy demand. Considered alongside the annual cost of building energy consumption, this reveals the cost of air infiltration. It also stresses the need for prominent building energy simulation engines to accurately account for its impact.

In this research, the relative accuracy of current air infiltration calculation methods is evaluated against an intricate multiphysics hygrothermal CFD building envelope analysis. The full-scale CFD analysis is based on a meticulous representation of cracking in building envelopes and on real-life conditions. The research found that even the most advanced current infiltration methods, including the one in DOE-2, show up to 96.13% relative error versus the CFD analysis.

An Enhanced Model for Combined Heat and Air Infiltration Simulation was developed. The model resulted in a 91.6% improvement in relative accuracy over current models. It reduces the error versus the CFD analysis to less than 4.5% while requiring less than 1% of the time required for such a complex hygrothermal analysis. The algorithm used in our model was demonstrated to be easy to integrate into DOE-2 and other engines as a standalone method for evaluating infiltration heat loads. This will vastly increase the accuracy of such simulation engines while maintaining the speed and ease of use that make them so widely used in building design.
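
For context on the infiltration heat loads discussed above, the sketch below evaluates the standard sensible infiltration load relation, Q = rho * c_p * V_dot * (T_out - T_in). The airflow and temperature values are illustrative, and the dissertation's enhanced combined heat-and-air model is not reproduced here.

```python
def infiltration_sensible_load(airflow_m3_s, t_out_c, t_in_c,
                               rho=1.2, cp=1006.0):
    """Sensible heat load from air infiltration, Q = rho * cp * V_dot * dT, in watts.

    airflow_m3_s : infiltration airflow rate, m^3/s
    t_out_c      : outdoor air temperature, deg C
    t_in_c       : indoor air temperature, deg C
    rho          : air density, kg/m^3 (approx. at sea level)
    cp           : specific heat of air, J/(kg*K)
    """
    return rho * cp * airflow_m3_s * (t_out_c - t_in_c)

# Example: 0.05 m^3/s of 35 C outdoor air leaking into a 24 C building
print(f"{infiltration_sensible_load(0.05, 35.0, 24.0):.0f} W")  # ~664 W
```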

Relevance: 100.00%

Abstract:

Indigenous movements have become increasingly powerful in the last couple of decades, and they are now important political actors in some South American countries, such as Bolivia, Ecuador, and, to a lesser extent, Peru and Chile. The rise of indigenous movements has provoked concern among U.S. policymakers and other observers, who have feared that these movements will exacerbate ethnic polarization, undermine democracy, and jeopardize U.S. interests in the region. This paper argues that concern over the rise of indigenous movements is greatly exaggerated. It maintains that the rise of indigenous movements has not brought about a marked increase in ethnic polarization in the region because most indigenous organizations have been ethnically inclusive and have eschewed violence. Although the indigenous movements have at times demonstrated a lack of regard for democratic institutions and procedures, they have also helped deepen democracy in the Andean region by promoting greater political inclusion and participation and by aggressively combating ethnic discrimination and inequality. Finally, this study suggests that the indigenous population has opposed some U.S.-sponsored initiatives, such as coca eradication and market reform, for legitimate reasons. Such policies have had some negative environmental, cultural, and economic consequences for indigenous people, which U.S. policymakers should try to address. The conclusion provides some specific policy recommendations on how to go about this.

Relevance: 100.00%

Abstract:

Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching logic-based controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before traffic breakdown occurs, the proposed VSL aims to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution of speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream. In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, Vehicle-to-Infrastructure communication is modeled in the microscopic simulation as part of this research to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual vehicle speed data to determine the locations of congestion. The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment. Several enhancements were proposed in this study to account for the breakdown characteristics at bottleneck locations in the calibration process. In this dissertation, the performance of the shockwave-based VSL is compared to that of VSL systems with different fixed VSL message sign locations, using the calibrated microscopic model. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during breakdown.
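
The controller described above is a heuristic switching logic with thresholds on prevailing flow conditions. A minimal sketch of such threshold-based switching is shown below; the occupancy thresholds, speed steps, and detector inputs are illustrative assumptions, not the dissertation's calibrated values.

```python
def select_speed_limit(occupancy_pct, avg_speed_mph,
                       base_limit=70,
                       thresholds=((25, 45), (18, 55), (12, 60))):
    """Pick a variable speed limit from detector occupancy and average speed
    using simple threshold-based switching (illustrative values only).

    thresholds : (occupancy_threshold_pct, posted_limit_mph) pairs,
                 checked from most to least congested.
    """
    for occ_threshold, limit in thresholds:
        if occupancy_pct >= occ_threshold or avg_speed_mph <= limit - 15:
            return limit          # congestion detected at this severity level
    return base_limit             # free flow: restore the base speed limit

# Example: heavy occupancy near a bottleneck drops the posted limit to 45 mph
print(select_speed_limit(occupancy_pct=28, avg_speed_mph=38))
```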

Relevance: 100.00%

Abstract:

This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves a risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage from contact with drifting icebergs. The environmental impacts of hydrocarbon dispersion can be severe, and the overall safety and economic concerns of pipeline leakage in the subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging, or chemical tracers, but in a remote location such as the subsea or the Arctic, a small chronic leak may go undetected for a long period. In the case of storage, an accidental release of hydrocarbons from a storage tank could lead to a pool fire, which could further escalate into domino effects. This chain of accidents may lead to extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, with contributing factors such as wind speed and direction, fuel type, and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different types of fluids showed that the static pressure and pressure gradient along the axial length of the pipeline have a sharp signature variation near the leak orifice at steady-state conditions. A transient simulation is performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the leak orifice increase. High-pressure fluid flow generates more noise than low-pressure fluid flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed has a significant effect on the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing, which can be applied to risk analysis. The outcome of this study will be helpful for a better understanding of the domino effects of pool fires in complex geometrical settings of process industries. Approaches to reducing and preventing these risks are discussed based on the results of the numerical simulations.
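
The leak-detection analysis rests on the power spectral density of the acoustic signal near the leak orifice. As a minimal sketch of how such a PSD can be estimated, the example below applies Welch's method (scipy.signal.welch) to a synthetic signal; the CFD-derived pressure traces themselves are not reproduced here.

```python
import numpy as np
from scipy.signal import welch

# Synthetic stand-in for an acoustic/pressure trace sampled near a leak orifice
fs = 10_000                                   # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
signal = 0.5 * np.sin(2 * np.pi * 1200 * t) + 0.1 * np.random.randn(t.size)

# Welch's method averages periodograms over overlapping segments,
# trading frequency resolution for a lower-variance PSD estimate.
freqs, psd = welch(signal, fs=fs, nperseg=1024)

peak = freqs[np.argmax(psd)]
print(f"dominant component near {peak:.0f} Hz")   # ~1200 Hz for this example
```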

Relevance: 100.00%

Abstract:

Variable reluctance motors have been increasingly used as an alternative for variable speed and high speed drives in many industrial applications, due to advantages such as simplicity of construction, robustness, and low cost. The most common applications in recent years are related to aeronautics, electric and hybrid vehicles, and wind power generation. This paper explores the theory, operation, design procedures, and analysis of a variable reluctance machine. An iterative design methodology is introduced and used to design a 1.25 kW prototype. Two methods are used for the analysis of the machine: an analytical method and finite element simulation. The results obtained by both methods are compared. The results of the finite element simulation are used to determine the inductance profiles and torque of the prototype. The magnetic saturation is examined visually and numerically at four critical points of the machine. The data collected in the simulation allow verification of the design and operating limits of the prototype. Moreover, the behavior of the output quantities (inductance, torque, and magnetic saturation) is analyzed as the physical dimensions of the motor are varied. Finally, a multiobjective optimization of the switched reluctance machine design using Differential Evolution and Genetic Algorithms is proposed. The optimized variables are the rotor and stator pole arcs, and the goals are to maximize the average torque, the average torque per unit of copper losses, and the average torque per unit of core volume. The initial and optimized designs are then compared.
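
As a hedged illustration of the optimization step, the sketch below uses scipy.optimize.differential_evolution to search stator and rotor pole-arc values against a stand-in, single-objective torque function. The real objectives in the paper (average torque, torque per copper losses, torque per core volume) come from the machine's analytical and finite element models, which are not reproduced here, and the bounds shown are placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution

def negative_avg_torque(arcs):
    """Placeholder objective: a stand-in for the machine model that would map
    (stator_arc, rotor_arc) in degrees to average torque. The real objective
    comes from the paper's analytical / finite element model."""
    stator_arc, rotor_arc = arcs
    torque = (np.sin(np.radians(stator_arc)) * np.sin(np.radians(rotor_arc))
              - 0.01 * abs(stator_arc - rotor_arc))
    return -torque                      # minimise the negative = maximise torque

# Bounds on the pole arcs (degrees) -- illustrative, not the prototype's limits
bounds = [(15.0, 45.0), (15.0, 45.0)]
result = differential_evolution(negative_avg_torque, bounds, seed=1, tol=1e-6)
print("best arcs (deg):", result.x, "average torque (stand-in):", -result.fun)
```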

Relevance: 100.00%

Abstract:

Surface finish is one of the most relevant aspects of machining operations, since it is one of the principal means of assessing quality. Surface finish also influences mechanical properties such as fatigue behavior, wear, and corrosion. The feed, the cutting speed, the cutting tool material, the workpiece material, and the cutting tool wear are some of the most important factors that affect the surface roughness of the machined surface. Due to the importance of martensitic 416 stainless steel in the petroleum industry, especially in valve parts and pump shafts, this material was selected to study the influence of feed per tooth and cutting speed on tool wear and surface integrity. The influence of tool wear on surface roughness is also analyzed. Results showed that high roughness values are obtained when using low cutting speed and feed per tooth, and that under these conditions tool wear decreases, prolonging tool life. Copyright © 2009 by ASME.
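
For context on the machining parameters studied, the sketch below evaluates the standard milling relations for cutting speed (v_c = pi * D * n / 1000) and table feed (v_f = f_z * z * n). The numerical values are illustrative, not the paper's experimental conditions.

```python
import math

def cutting_speed_m_min(tool_diameter_mm, spindle_rpm):
    """Cutting speed v_c = pi * D * n / 1000, in m/min."""
    return math.pi * tool_diameter_mm * spindle_rpm / 1000.0

def table_feed_mm_min(feed_per_tooth_mm, n_teeth, spindle_rpm):
    """Table feed v_f = f_z * z * n, in mm/min."""
    return feed_per_tooth_mm * n_teeth * spindle_rpm

# Illustrative milling conditions (not the paper's experimental values)
print(f"v_c = {cutting_speed_m_min(25.0, 2000):.0f} m/min")     # ~157 m/min
print(f"v_f = {table_feed_mm_min(0.10, 4, 2000):.0f} mm/min")   # 800 mm/min
```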

Relevance: 100.00%

Abstract:

Improvements in genomic technology, both in the increased speed and the reduced cost of sequencing, have expanded our appreciation of the abundance of human genetic variation. However, the sheer amount of variation, as well as the varying type and genomic content of that variation, poses a challenge in understanding the clinical consequence of a single mutation. This work uses several methodologies to interpret the observed variation in the human genome, and presents novel strategies for the prediction of allele pathogenicity.

Using the zebrafish model system as an in vivo assay of allele function, we identified a novel driver of Bardet-Biedl Syndrome (BBS) in CEP76. A combination of targeted sequencing of 785 cilia-associated genes in a cohort of BBS patients and subsequent in vivo functional assays recapitulating the human phenotype gave strong evidence for the role of CEP76 mutations in the pathology of an affected family. This portion of the work demonstrated the necessity of functional testing in validating disease-associated mutations, and added to the catalogue of known BBS disease genes.

Further study into the role of copy-number variations (CNVs) in a cohort of BBS patients showed the significant contribution of CNVs to disease pathology. Using high-density array comparative genomic hybridization (aCGH), we were able to identify pathogenic CNVs as small as several hundred bp. Dissection of the constituent genes and in vivo experiments investigating epistatic interactions between affected genes allowed for an appreciation of several paradigms by which CNVs can contribute to disease. This study revealed that the contribution of CNVs to disease in BBS patients is much higher than previously expected, and demonstrated the necessity of considering CNV contributions in future (and retrospective) investigations of human genetic disease.

Finally, we used a combination of comparative genomics and in vivo complementation assays to identify second-site compensatory modification of pathogenic alleles. These pathogenic alleles, which are found compensated in other species (termed compensated pathogenic deviations [CPDs]), represent a significant fraction (3-10%) of human disease-associated alleles. In silico pathogenicity prediction algorithms, a valuable method of allele prioritization, often misrepresent these alleles as benign, leading to the omission of possibly informative variants in studies of human genetic disease. We created a mathematical model that was able to predict CPDs and putative compensatory sites, and showed functionally in vivo that second-site mutations can mitigate the pathogenicity of disease alleles. Additionally, we made publicly available an in silico module for the prediction of CPDs and modifier sites.

These studies have advanced the ability to interpret the pathogenicity of multiple types of human variation, and have made tools available for others to do the same.

Relevance: 100.00%

Abstract:

Much of what is known about word recognition in toddlers comes from eye-tracking studies. Here we show that the speed and facility with which children recognize words, as revealed in such studies, cannot be attributed to a task-specific, closed-set strategy; rather, children's gaze to referents of spoken nouns reflects successful search of the lexicon. Toddlers' spoken word comprehension was examined in the context of pictures that had two possible names (such as a cup of juice, which could be called "cup" or "juice") and pictures that had only one likely name for toddlers (such as "apple"), using a visual world eye-tracking task and a picture-labeling task (n = 77, mean age 21 months). Toddlers were just as fast and accurate in fixating named pictures with two likely names as pictures with one. If toddlers do name pictures to themselves, the name provides no apparent benefit in word recognition, because there is no cost to understanding an alternative lexical construal of the picture. In toddlers, as in adults, spoken words rapidly evoke their referents.