957 results for GLOBALLY HYPERBOLIC SPACETIMES
Abstract:
Eczema prevalence rates among Irish infants are unreported, despite eczema being the most common inflammatory condition of infancy. Maternal and infant nutritional status, including vitamin D and other fat-soluble vitamins, as well as early infant feeding, have been linked with eczema initiation and development; early nutrition could therefore be a modifiable risk factor. The objective of this thesis was to prospectively describe early infant feeding and complementary feeding practices, to evaluate infant vitamin D supplementation practice, to quantify cord serum 25-hydroxyvitamin D [25(OH)D] and propose reference intervals for vitamin D metabolites, to report eczema prevalence, and to explore the potential role of infant nutrition in eczema. These research needs were investigated through the Cork BASELINE (Babies After SCOPE: Evaluating the Longitudinal Impact with Neurological and Nutritional Endpoints) Birth Cohort Study (n = 2137). This thesis was the first comprehensive report from the birth cohort, so it was important to describe the cohort's sociodemographic profile. Although sociodemographic characteristics compared well with national data, there was an over-representation of educated mothers, which may limit the generalizability of the results. From August 2008 through November 2011, comprehensive postnatal assessments were completed at day 2 and at 2, 6, 12 and 24 months. Breastfeeding rates were low, while complementary feeding practices were broadly compliant with national guidelines. The implementation of a national infant vitamin D supplementation policy had a major impact on supplementation practice. Low levels of serum 25(OH)D were universal among Irish neonates. Eczema is a complex and multifaceted disease, and its prevalence is increasing globally. This was the first report of eczema prevalence data among Irish infants, and the rates were comparable with international reports. Given the high prevalence and the considerable burden eczema places on the lives of sufferers, the need for intensive research efforts to identify causes and therapeutic strategies to prevent or reduce eczema was re-emphasized in this thesis.
Abstract:
Background and Aims: Caesarean section rates have increased in recent decades, and the effects on subsequent pregnancy outcomes are largely unknown. Prior research has hypothesised that Caesarean section delivery may lead to an increased risk of subsequent stillbirth, miscarriage, ectopic pregnancy and sub-fertility. Structure and Methods: Papers 1-3 are systematic reviews with meta-analyses. Papers 4-6 are findings from this thesis on the rates of subsequent stillbirth, miscarriage, ectopic pregnancy and live birth by mode of delivery. Results: In the systematic reviews and meta-analyses, women with a previous Caesarean section had a 23% increase in the odds of subsequent stillbirth, no increase in the odds of subsequent ectopic pregnancy, and a 10% reduction in the odds of subsequent live birth. Danish cohorts: Results from the Danish Civil Registration System (CRS) cohort revealed a small increased rate of subsequent stillbirth and ectopic pregnancy among women with a primary Caesarean section, which remained in the analyses by type of Caesarean. No increased rate of miscarriage was found among women with a primary Caesarean section. In the CRS data, women with a primary Caesarean section had a significantly reduced rate of subsequent live birth, particularly among women with primary elective and maternal-requested Caesarean sections. In the Aarhus Birth Cohort (ABC), the overall effect of mode of delivery on the rate of, and time to, the next live birth was minimal. Conclusions: Primary Caesarean section was associated with a small increased rate of stillbirth and ectopic pregnancy, which may be due in part to underlying medical conditions. No increased rate of miscarriage was found. A reduced rate of subsequent live birth was found among women with a Caesarean section in the CRS data. In the smaller ABC cohort, a small reduction in the rate of subsequent live birth was found among women with a primary Caesarean section, most likely due to maternal choice rather than any ill effect of the Caesarean. The findings of this study, the largest and most comprehensive to date, will be of significant interest to health care providers and women globally.
Abstract:
'The ecological emergency' describes both our emergence into, and the way we relate within, a set of globally urgent circumstances brought about through anthropogenic impact. I identify two phases to this emergency. Firstly, there is the anthropogenic impact itself, interpreted through various conceptual models. Secondly, however, there is the increasingly entrenched commitment to divergent conceptual positions, which leads to a growing disparateness in attitudes and a concurrent difficulty in finding any grounds for convergence in response. I begin by reviewing the environmental ethics literature in order to clarify which components of the implicit narratives and beliefs of different positions create the foundations for such disparateness of views. I identify the conceptual frameworks through which moral agency and human responsibility are viewed, and which justify an ethical response to the ecological emergency. In particular, I focus on Paul Taylor's thesis of 'respect for nature' as a framework for revising both the idea that we are 'moral' and the idea that we are 'agents' in this unique way, and I open to question the idea that any response to the ecological emergency need be couched in ethical terms. This revision leads me to formulate an alternative conceptual model that makes use of Timothy Morton's idea of enmeshment. I propose that we dramatically revise our idea of moral agency using the idea of enmeshment as a starting point. I develop an alternative framework that locates our capacity for responsibility within our capacity for realisation, both in the sense of understanding, and of making real, sets of conditions within our enmeshment. I draw parallels between this idea of 'realisation as agency' and the work of Dōgen and other non-dualists. I then propose a revised understanding of 'the good' of systems from a biophysical perspective, and compare this with certain features of Asian traditions of thought. I consider the practical implications of these revisions, and I conclude that the act of paying close attention, or realising, contains our agency, as does the attitude, or manner, with which we focus. This gives us the basis for a convergent response to the ecological emergency: it is the way of our engagement that is the key to responding.
Abstract:
Introduction: Stroke is a chronic condition that significantly impacts morbidity and mortality (Balanda et al. 2010). Globally, the complexity of stroke is well documented, and more recently in Ireland as part of the National Survey of Stroke Survivors (Horgan et al. 2014). A number of factors are known to influence adaptation post stroke; however, there is a lack of research explaining the variability in how survivors adapt. Hardiness is a broad personality trait that leads to better outcomes. This study investigated the influence of hardiness and physical function on psychosocial adaptation post stroke. Methods: A quantitative cross-sectional, correlational, exploratory study was conducted between April and November 2013. The sample consisted of stroke survivors (n=100) who were recruited from three hospital outpatient departments and completed a questionnaire package. Results: The mean age of participants was 76 years (range 70-80), and over half (56%) of the participants achieved the maximum score of 20 on the Barthel Index, indicating independence in activities of daily living. The median number of days since stroke onset was 91 (range 74-128). The total mean score (standard deviation) for hardiness, as measured by the Dispositional Resilience Scale, was 1.89 (0.4), indicating medium hardiness (possible range 0-3). Psychosocial adaptation was measured using the Psychosocial Adjustment to Illness Scale; the total weighted mean (standard deviation) was 0.54 (0.3), indicating a satisfactory level of psychosocial adaptation (possible range 0-3). A hierarchical multiple linear regression was performed with six independent variables (hardiness, living arrangement, length of hospital stay, number of days since stroke onset, physical function and self-rated recovery). Physical function (p<0.001) and hardiness (p=0.008) were significantly related to psychosocial adaptation. Altogether, 65% of the variation in psychosocial adaptation was explained by the combined effect of the independent variables. Physical functioning had the highest unique contribution (11%) to the variance in psychosocial adaptation, while self-rated recovery, hardiness, and living arrangements contributed 3% each. Conclusion: This research provides important information regarding factors that influence psychosocial adaptation at 3 months post stroke. Physical function contributed significantly to psychosocial adaptation post stroke. The personality trait of hardiness provides insight into how behaviour influences adaptation post stroke. While hardiness also had a strong relationship with psychosocial adaptation, further research is necessary to fully understand this process.
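As an illustration of how a hierarchical regression and the "unique contributions" quoted above can be computed, here is a hedged sketch on synthetic data; the variable names and generated values are assumptions for demonstration only, not the study's data:

```python
# Hedged sketch: fit a full multiple regression, then estimate each predictor's
# unique contribution as the drop in R-squared when that predictor is removed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 100
df = pd.DataFrame({
    "hardiness":           rng.normal(1.9, 0.4, n),
    "living_alone":        rng.integers(0, 2, n),
    "hospital_stay":       rng.normal(20, 8, n),
    "days_since_onset":    rng.normal(91, 15, n),
    "physical_function":   rng.normal(17, 3, n),
    "self_rated_recovery": rng.normal(6, 2, n),
})
# Synthetic outcome loosely echoing the reported associations.
df["adaptation"] = (0.5 - 0.03 * df["physical_function"]
                    - 0.10 * df["hardiness"] + rng.normal(0, 0.2, n))

predictors = ["hardiness", "living_alone", "hospital_stay",
              "days_since_onset", "physical_function", "self_rated_recovery"]
full = smf.ols("adaptation ~ " + " + ".join(predictors), data=df).fit()

for p in predictors:
    reduced = smf.ols("adaptation ~ " + " + ".join(x for x in predictors if x != p),
                      data=df).fit()
    print(f"{p}: unique contribution = {full.rsquared - reduced.rsquared:.3f}")
```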
Abstract:
Open environments involve distributed entities interacting with each other in an open manner. Many distributed entities are unknown to each other but need to collaborate and share resources in a secure fashion. Usually, resource owners alone decide who is trusted to access their resources. Since resource owners in open environments do not have a complete picture of all trusted entities, trust management frameworks are used to ensure that only authorized entities will access requested resources. Every trust management system has limitations, and these limitations can be exploited by malicious entities. One vulnerability is due to the lack of a globally unique interpretation for permission specifications. This limitation means that a malicious entity which receives a permission in one domain may misuse the permission in another domain via some deceptive but apparently authorized route; this malicious behaviour is called subterfuge. This thesis develops a secure approach, Subterfuge Safe Trust Management (SSTM), that prevents subterfuge by malicious entities. SSTM employs the Subterfuge Safe Authorization Language (SSAL), which uses the idea of a local permission with a globally unique interpretation (localPermission) to resolve the misinterpretation of permissions. We model and implement SSAL with an ontology-based approach, SSALO, which provides a generic representation for knowledge related to the SSAL-based security policy. SSALO enables the integration of heterogeneous security policies, which is useful for secure cooperation among principals in open environments where each principal may have a different security policy with a different implementation. Another advantage of the ontology-based approach is the Open World Assumption, whereby reasoning over an existing security policy is easily extended to include further security policies that might be discovered in an open distributed environment. We add two extra SSAL rules to support dynamic coalition formation and secure cooperation among coalitions. Secure federation of cloud computing platforms and secure federation of XMPP servers are presented as case studies of SSTM. The results show that SSTM provides robust accountability for the use of permissions in federation. It is also shown that SSAL is a suitable policy language for expressing subterfuge-safe policy statements due to its well-defined semantics, ease of use, and integrability.
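To make the localPermission idea concrete, here is a minimal, hypothetical sketch (plain Python, not SSAL syntax) of a permission bound to a globally unique issuer identifier, so that the same local name granted in one domain cannot be replayed in another:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LocalPermission:
    """A local permission name bound to the globally unique identifier of the
    principal that defines it (assumed here to be a key fingerprint), so the
    same local name issued in two domains can never be confused."""
    issuer_id: str   # globally unique issuer identifier (assumption: key fingerprint)
    name: str        # locally meaningful permission name

def authorize(granted: set, requested: LocalPermission) -> bool:
    # A request is honoured only if the exact (issuer, name) pair was granted;
    # a "read:reports" permission from a different issuer does not match.
    return requested in granted

# Example: the same local name under two issuers is two distinct permissions.
p_a = LocalPermission("sha256:ab12...", "read:reports")
p_b = LocalPermission("sha256:cd34...", "read:reports")
assert authorize({p_a}, p_a)
assert not authorize({p_a}, p_b)   # a subterfuge-style replay fails
```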
Abstract:
This article examines the behavior of equity trading volume and volatility for the individual firms composing the Standard & Poor's 100 composite index. Using multivariate spectral methods, we find that fractionally integrated processes best describe the long-run temporal dependencies in both series. Consistent with a stylized mixture-of-distributions hypothesis model in which the aggregate "news"-arrival process possesses long-memory characteristics, the long-run hyperbolic decay rates appear to be common across each volume-volatility pair.
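For context, the "hyperbolic decay" at issue is the defining long-memory property of a fractionally integrated process; the equations below are a standard textbook illustration, not formulas reproduced from the article:

```latex
% ARFIMA(0,d,0) process and its hyperbolic autocorrelation decay
% (standard long-memory results, included here only for reference)
\[
  (1-L)^{d} x_t = \varepsilon_t , \qquad 0 < d < \tfrac{1}{2},
\]
\[
  \rho(k) \;\sim\; C\, k^{\,2d-1} \quad (k \to \infty),
\]
% versus the geometric (short-memory) decay of a stationary ARMA process:
\[
  \rho(k) \;\sim\; C\, \phi^{\,k}, \qquad |\phi| < 1 .
\]
```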
Abstract:
The research project takes place within the technology acceptability framework, which tries to understand the use made of new technologies, and concentrates more specifically on the factors that influence the acceptance of multi-touch devices (MTD) and the intention to use them. Why be interested in MTD? Nowadays, this technology is used in all kinds of human activities, e.g. leisure, study or work activities (Rogowski and Saeed, 2012). However, handling and data entry by means of gestures on a multi-touch-sensitive screen impose a number of constraints and consequences which remain mostly unknown (Park and Han, 2013). Currently, little research in ergonomic psychology addresses the implications of these new human-computer interactions for task fulfillment. This research project aims to investigate the cognitive, sensori-motor and motivational processes taking place during the use of these devices. The project will analyze the influence of the use of gestures and of the type of gesture used, simple or complex (Lao, Heng, Zhang, Ling, and Wang, 2009), as well as of the feeling of personal self-efficacy in the use of MTD, on task engagement, attention mechanisms and perceived disorientation (Chen, Linen, Yen, and Linn, 2011) when using MTD. For that purpose, the various above-mentioned concepts will be measured within a usability laboratory (U-Lab) with self-reported methods (questionnaires) and objective indicators (physiological indicators, eye tracking). Overall, the research aims to understand the processes at stake, as well as the advantages and drawbacks of this new technology, in order to favor a better compatibility and adequacy between gestures, executed tasks and MTD. The conclusions will allow recommendations for the use of MTD in specific contexts (e.g. learning contexts).
Abstract:
New burned area datasets and top-down constraints from atmospheric concentration measurements of pyrogenic gases have decreased the large uncertainty in fire emissions estimates. However, significant gaps remain in our understanding of the contribution of deforestation, savanna, forest, agricultural waste, and peat fires to total global fire emissions. Here we used a revised version of the Carnegie-Ames-Stanford-Approach (CASA) biogeochemical model and improved satellite-derived estimates of area burned, fire activity, and plant productivity to calculate fire emissions for the 1997-2009 period on a 0.5° spatial resolution with a monthly time step. For November 2000 onwards, estimates were based on burned area, active fire detections, and plant productivity from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. Prior to MODIS (1997-2000), we used maps of burned area derived from Tropical Rainfall Measuring Mission (TRMM) Visible and Infrared Scanner (VIRS) and Along-Track Scanning Radiometer (ATSR) active fire data, and estimates of plant productivity derived from Advanced Very High Resolution Radiometer (AVHRR) observations during the same period. For the partitioning we focused on the MODIS era. Average global fire carbon emissions according to this version 3 of the Global Fire Emissions Database (GFED3) were 2.0 Pg C year^-1 with significant interannual variability during 1997-2001 (2.8 Pg C year^-1 in 1998 and 1.6 Pg C year^-1 in 2001). Globally, emissions during 2002-2007 were relatively constant (around 2.1 Pg C year^-1) before declining in 2008 (1.7 Pg C year^-1) and 2009 (1.5 Pg C year^-1), partly due to lower deforestation fire emissions in South America and tropical Asia. On a regional basis, emissions were highly variable during 2002-2007 (e.g., boreal Asia, South America, and Indonesia), but these regional differences canceled out at a global level. During the MODIS era (2001-2009), most carbon emissions were from fires in grasslands and savannas (44%), with smaller contributions from tropical deforestation and degradation fires (20%), woodland fires (mostly confined to the tropics, 16%), forest fires (mostly in the extratropics, 15%), agricultural waste burning (3%), and tropical peat fires (3%). The contribution from agricultural waste fires was likely a lower bound because our approach for measuring burned area could not detect all of these relatively small fires. Total carbon emissions were on average 13% lower than in our previous (GFED2) work. For reduced trace gases such as CO and CH4, deforestation, degradation, and peat fires were more important contributors because of higher emissions of reduced trace gases per unit carbon combusted compared with savanna fires. Carbon emissions from tropical deforestation, degradation, and peatland fires were on average 0.5 Pg C year^-1. The carbon emissions from these fires may not be balanced by regrowth following fire. Our results provide the first global assessment of the contribution of different sources to total global fire emissions for the past decade, and supply the community with an improved 13-year fire emissions time series. © 2010 Author(s).
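As background on how such bottom-up emission estimates are assembled, the sketch below follows the classic burned area × fuel load × combustion completeness × emission factor form that CASA-based approaches follow conceptually; the function names and numbers are illustrative assumptions, not GFED3 parameters:

```python
# Hedged sketch of a per-grid-cell, per-month fire-emissions calculation.
# All values below are illustrative placeholders.

def cell_carbon_emission(burned_area_km2: float,
                         fuel_load_kg_c_per_km2: float,
                         combustion_completeness: float) -> float:
    """Carbon emitted from one grid cell in one month (kg C)."""
    return burned_area_km2 * fuel_load_kg_c_per_km2 * combustion_completeness

def trace_gas_emission(carbon_kg: float, emission_factor_g_per_kg_c: float) -> float:
    """Convert combusted carbon into a trace-gas amount (grams), e.g. CO."""
    return carbon_kg * emission_factor_g_per_kg_c

# Illustrative savanna cell: 25 km2 burned, assumed fuel load and 85% combustion.
carbon = cell_carbon_emission(25.0, 3.0e5, 0.85)
co_grams = trace_gas_emission(carbon, 65.0)   # assumed CO emission factor (g per kg C)
```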
Abstract:
Exogenous gene delivery to alter the function of the heart is a potential novel therapeutic strategy for treatment of cardiovascular diseases such as heart failure (HF). Before gene therapy approaches to alter cardiac function can be realized, efficient and reproducible in vivo gene delivery techniques must be established to efficiently transfer transgenes globally to the myocardium. We have been testing the hypothesis that genetic manipulation of the myocardial beta-adrenergic receptor (beta-AR) system, which is impaired in HF, can enhance cardiac function. We have delivered adenoviral transgenes, including the human beta2-AR (Adeno-beta2AR), to the myocardium of rabbits using an intracoronary approach. Catheter-mediated Adeno-beta2AR delivery produced diffuse multichamber myocardial expression, peaking 1 week after gene transfer. A total of 5 × 10^11 viral particles of Adeno-beta2AR reproducibly produced 5- to 10-fold beta-AR overexpression in the heart, which, at 7 and 21 days after delivery, resulted in increased in vivo hemodynamic function compared with control rabbits that received an empty adenovirus. Several physiological parameters, including dP/dt_max as a measure of contractility, were significantly enhanced basally and showed increased responsiveness to the beta-agonist isoproterenol. Our results demonstrate that global myocardial in vivo gene delivery is possible and that genetic manipulation of beta-AR density can result in enhanced cardiac performance. Thus, replacement of lost receptors seen in HF may represent a novel inotropic therapy.
Abstract:
Confronting the rapidly increasing, worldwide reliance on biometric technologies to surveil, manage, and police human beings, my dissertation
Abstract:
Claims of injustice in global forest governance are prolific: assertions of colonization, marginalization and disenfranchisement of forest-dependent people, and privatization of common resources are some of the most severe allegations of injustice resulting from globally driven forest conservation initiatives. At its core, the debate over the future of the world's forests is fraught with ethical concerns. Policy makers are not only deciding how forests should be governed, but also who will be winners and losers, and who should have a voice in the decision-making processes. For 30 years, policy makers have sought to redress the concerns of the world's 1.6 billion forest-dependent poor by introducing rights-based and participatory approaches to conservation. Despite these efforts, however, claims of injustice persist. This research examines possible explanations for continued claims of injustice by asking: What are the barriers to delivering justice to forest-dependent communities? Using data collected through surveys, interviews, and collaborative event ethnography in Laos and at the Tenth Conference of the Parties to the Convention on Biological Diversity, this dissertation examines the pursuit of justice in global forest governance across multiple scales of governance. The findings reveal that particular conceptualizations of justice have become a central part of the metanormative fabric of global environmental governance, inhibiting institutional evolution and thereby perpetuating the justice gap in global forest governance.
Abstract:
Introduction: Traditional medicines are one of the most important means of achieving total health care coverage globally, and their importance in Tanzania extends beyond the impoverished rural areas. Their use remains high even in urban settings among the educated middle and upper classes. They are a critical component of healthcare in Tanzania, but they can also have harmful side effects. Therefore, we sought to understand the decision-making and reasoning processes behind their use by building an explanatory model for the use of traditional medicines in Tanzania.
Methods: We conducted a mixed-methods study between December 2013 and June 2014 in the Kilimanjaro Region of Tanzania. Using purposive sampling methods, we conducted focus group discussions (FGDs) and in-depth interviews of key informants, and the qualitative data were analyzed using an inductive Framework Method. A structured survey was created, piloted, and then administered to a random sample of adults. We reported on the reliability and validity of the structured survey, and we used triangulation from multiple sources to synthesize the qualitative and quantitative data.
Results: Five FGDs comprising 59 participants, and 27 in-depth interviews, were conducted in total; 16 of the in-depth interviews were with self-described traditional practitioners or herbal vendors. We identified five major thematic categories that relate to the decision to use traditional medicines in Kilimanjaro: healthcare delivery, disease understanding, credibility of the traditional practices, health status, and strong cultural beliefs.
A total of 473 participants (24.1% male) completed the structured survey. The most common reasons for taking traditional medicines were that they are more affordable (14%, 12.0-16.0), failure of hospital medicines (13%, 11.1-15.0), they work better (12%, 10.7-14.4), they are easier to obtain (11%, 9.48-13.1), they are found naturally or free (8%, 6.56-9.68), hospital medicines have too many chemicals (8%, 6.33-9.40), and they have fewer side effects (8%, 6.25-9.30). The most common uses of traditional medicines were for symptomatic conditions (42%), chronic diseases (14%), reproductive problems (11%), and malaria and febrile illnesses (10%). Participants currently taking hospital medicines for chronic conditions were nearly twice as likely to report traditional medicine use in the past year (RR 1.97, p=0.05).
Conclusions: We built a broad explanatory model for the use of traditional medicines in Kilimanjaro. The use of traditional medicines is not limited to rural or low-socioeconomic populations, and concurrent use of traditional medicines and biomedicine is high, with frequent ethnomedical doctor shopping. Our model provides a working framework for understanding the complex interactions between biomedicine and traditional medicine. Future disease management and treatment programs will benefit from this understanding, and it can lead to synergistic policies with more effective implementation.
Abstract:
BACKGROUND: Phenotypic differences among species have long been systematically itemized and described by biologists in the process of investigating phylogenetic relationships and trait evolution. Traditionally, these descriptions have been expressed in natural language within the context of individual journal publications or monographs. As such, this rich store of phenotype data has been largely unavailable for statistical and computational comparisons across studies or integration with other biological knowledge. METHODOLOGY/PRINCIPAL FINDINGS: Here we describe Phenex, a platform-independent desktop application designed to facilitate efficient and consistent annotation of phenotypic similarities and differences using Entity-Quality syntax, drawing on terms from community ontologies for anatomical entities, phenotypic qualities, and taxonomic names. Phenex can be configured to load only those ontologies pertinent to a taxonomic group of interest. The graphical user interface was optimized for evolutionary biologists accustomed to working with lists of taxa, characters, character states, and character-by-taxon matrices. CONCLUSIONS/SIGNIFICANCE: Annotation of phenotypic data using ontologies and globally unique taxonomic identifiers will allow biologists to integrate phenotypic data from different organisms and studies, leveraging decades of work in systematics and comparative morphology.
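For readers unfamiliar with Entity-Quality syntax, here is a hedged sketch of the kind of annotation Phenex is designed to capture; apart from the PATO term for 'absent', the ontology and taxon identifiers below are placeholders, not real IDs:

```python
# Illustrative Entity-Quality (EQ) annotation for a single character state.
# Placeholder identifiers are marked as such; only PATO:0000462 ('absent') is real.
eq_annotation = {
    "taxon": {"label": "Danio rerio", "id": "TAXON:0000000"},          # placeholder taxon ID
    "character": "Pectoral fin",
    "state": "absent",
    "phenotype": {
        "entity":  {"label": "pectoral fin", "id": "ANAT:0000000"},    # anatomy-ontology term (placeholder ID)
        "quality": {"label": "absent",       "id": "PATO:0000462"},    # PATO quality 'absent'
    },
}
```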
Abstract:
The main conclusion of this dissertation is that global H2 production within young ocean crust (<10 Mya) is higher than currently recognized, in part because current estimates of H2 production accompanying the serpentinization of peridotite may be too low (Chapter 2) and in part because a number of abiogenic H2-producing processes have heretofore gone unquantified (Chapter 3). The importance of free H2 to a range of geochemical processes makes the quantitative understanding of H2 production advanced in this dissertation pertinent to an array of open research questions across the geosciences (e.g. the origin and evolution of life and the oxidation of the Earth’s atmosphere and oceans).
The first component of this dissertation (Chapter 2) examines H2 produced within young ocean crust [e.g. near the mid-ocean ridge (MOR)] by serpentinization. In the presence of water, olivine-rich rocks (peridotites) undergo serpentinization (hydration) at temperatures of up to ~500°C but only produce H2 at temperatures up to ~350°C. A simple analytical model is presented that mechanistically ties the process to seafloor spreading and explicitly accounts for the importance of temperature in H2 formation. The model suggests that H2 production increases with the rate of seafloor spreading and the net thickness of serpentinized peridotite (S-P) in a column of lithosphere. The model is applied globally to the MOR using conservative estimates for the net thickness of lithospheric S-P, our least certain model input. Despite the large uncertainties surrounding the amount of serpentinized peridotite within oceanic crust, conservative model parameters suggest a magnitude of H2 production (~10^12 moles H2/y) that is larger than the most widely cited previous estimates (~10^11, although previous estimates range from 10^10 to 10^12 moles H2/y). Certain model relationships are also consistent with what has been established through field studies, for example that the highest H2 fluxes (moles H2/km^2 seafloor) are produced near slower-spreading ridges (<20 mm/y). Other modeled relationships are new and represent testable predictions. Principal among these is that about half of the H2 produced globally is produced off-axis beneath faster-spreading seafloor (>20 mm/y), a region where only one measurement of H2 has been made thus far and which is ripe for future investigation.
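The scaling described here can be illustrated with a back-of-the-envelope sketch; all numerical values below are assumptions chosen only to show how spreading rate and S-P thickness combine, not parameters from the dissertation's model:

```python
# Hedged order-of-magnitude sketch: H2 production tied to seafloor spreading
# and the net thickness of serpentinized peridotite (S-P). Values are assumed.

RIDGE_LENGTH_M       = 6.0e7    # global mid-ocean ridge length (~60,000 km, assumed)
SPREADING_RATE_M_YR  = 0.05     # mean full spreading rate (~50 mm/y, assumed)
SP_THICKNESS_M       = 350.0    # net serpentinized-peridotite thickness per column (assumed)
ROCK_DENSITY_KG_M3   = 3.0e3    # peridotite density (assumed)
H2_YIELD_MOL_PER_KG  = 0.3      # mol H2 per kg of peridotite serpentinized (assumed)

# New lithosphere created per year that carries the S-P layer:
volume_m3_per_yr = RIDGE_LENGTH_M * SPREADING_RATE_M_YR * SP_THICKNESS_M
mass_kg_per_yr   = volume_m3_per_yr * ROCK_DENSITY_KG_M3
h2_mol_per_yr    = mass_kg_per_yr * H2_YIELD_MOL_PER_KG

print(f"{h2_mol_per_yr:.1e} mol H2 / yr")   # ~1e12, the order of magnitude quoted above
```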
In the second part of this dissertation (Chapter 3), I construct the first budget for free H2 in young ocean crust that quantifies and compares all currently recognized H2 sources and H2 sinks. First global estimates of budget components are proposed in instances where previous estimates could not be located, provided that the literature on the specific budget component was not too sparse to do so. Results suggest that the nine known H2 sources, listed in order of quantitative importance, are: crystallization (6×10^12 moles H2/y, or 61% of total H2 production), serpentinization (2×10^12 moles H2/y, or 21%), magmatic degassing (7×10^11 moles H2/y, or 7%), lava-seawater interaction (5×10^11 moles H2/y, or 5%), low-temperature alteration of basalt (5×10^11 moles H2/y, or 5%), high-temperature alteration of basalt (3×10^10 moles H2/y, or <1%), catalysis (3×10^8 moles H2/y, or <<1%), radiolysis (2×10^8 moles H2/y, or <<1%), and pyrite formation (3×10^6 moles H2/y, or <<1%). Next we consider two well-known H2 sinks, H2 lost to the ocean and H2 occluded within rock minerals, and our analysis suggests that both are of similar size (each 6×10^11 moles H2/y). Budgeting results suggest a large difference between H2 sources (total production = 1×10^13 moles H2/y) and H2 sinks (total losses = 1×10^11 moles H2/y). Assuming this large difference represents H2 consumed by microbes (total consumption = 9×10^11 moles H2/y), we explore rates of primary production by the chemosynthetic, sub-seafloor biosphere. Although the numbers presented require further examination and future modifications, the analysis suggests that the sub-seafloor H2 budget is similar to the sub-seafloor CH4 budget in the sense that globally significant quantities of both of these reduced gases are produced beneath the seafloor but never escape the seafloor due to microbial consumption.
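As a quick arithmetic check on the source side of this budget, the sketch below sums the nine quoted source terms (values copied from the abstract) and reproduces the stated total of roughly 1×10^13 moles H2/y and the leading shares; it is a verification aid only, not part of the original analysis:

```python
# Sum the quoted H2 sources (mol H2/yr) and recover the stated total and shares.
sources = {
    "crystallization":              6e12,
    "serpentinization":             2e12,
    "magmatic degassing":           7e11,
    "lava-seawater interaction":    5e11,
    "low-T alteration of basalt":   5e11,
    "high-T alteration of basalt":  3e10,
    "catalysis":                    3e8,
    "radiolysis":                   2e8,
    "pyrite formation":             3e6,
}

total = sum(sources.values())                        # ~9.7e12, i.e. ~1e13 mol H2/yr
shares = {k: v / total for k, v in sources.items()}  # crystallization ~0.61, serpentinization ~0.21
print(f"total production ~ {total:.1e} mol H2/yr")
```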
The third and final component of this dissertation (Chapter 4) explores the self-organization of barchan sand dune fields. In nature, barchan dunes typically exist as members of larger dune fields that display striking, enigmatic structures that cannot be readily explained by examining the dynamics at the scale of single dunes, or by appealing to patterns in external forcing. To explore the possibility that observed structures emerge spontaneously as a collective result of many dunes interacting with each other, we built a numerical model that treats barchans as discrete entities that interact with one another according to simplified rules derived from theoretical and numerical work, and from field observations: Dunes exchange sand through the fluxes that leak from the downwind side of each dune and are captured on their upstream sides; when dunes become sufficiently large, small dunes are born on their downwind sides (“calving”); and when dunes collide directly enough, they merge. Results show that these relatively simple interactions provide potential explanations for a range of field-scale phenomena including isolated patches of dunes and heterogeneous arrangements of similarly sized dunes in denser fields. The results also suggest that (1) dune field characteristics depend on the sand flux fed into the upwind boundary, although (2) moving downwind, the system approaches a common attracting state in which the memory of the upwind conditions vanishes. This work supports the hypothesis that calving exerts a first order control on field-scale phenomena; it prevents individual dunes from growing without bound, as single-dune analyses suggest, and allows the formation of roughly realistic, persistent dune field patterns.
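A minimal, hypothetical sketch of discrete-dune rules of this general kind (sand gain and loss, calving above a size threshold, merging on close approach) is given below; the thresholds, rates, and scalings are illustrative assumptions, not the model actually used in the dissertation:

```python
# Hedged sketch of a discrete barchan-dune field with simplified interaction rules.
import random
from dataclasses import dataclass

@dataclass
class Dune:
    x: float        # downwind position
    y: float        # cross-wind position
    volume: float   # sand volume (arbitrary units)

def step(dunes, influx=1.0, calve_volume=50.0, merge_dist=1.0, dt=1.0):
    """Advance every dune one time step under simplified rules (all rates assumed)."""
    new_dunes = []
    for d in dunes:
        # 1. Smaller dunes migrate faster (simplified inverse-size scaling) and leak sand downwind.
        d.x += dt / max(d.volume, 1e-6) ** 0.5
        leak = 0.1 * d.volume * dt
        d.volume += influx * dt - leak
        # 2. Calving: a sufficiently large dune sheds a small dune on its downwind side.
        if d.volume > calve_volume:
            child = Dune(d.x + 1.0, d.y + random.uniform(-0.5, 0.5), 0.2 * d.volume)
            d.volume -= child.volume
            new_dunes.append(child)
    dunes.extend(new_dunes)
    # 3. Merging: dunes that approach each other closely enough combine into one.
    merged = []
    for d in sorted(dunes, key=lambda u: u.x):
        if merged and abs(merged[-1].x - d.x) < merge_dist and abs(merged[-1].y - d.y) < merge_dist:
            merged[-1].volume += d.volume
        else:
            merged.append(d)
    return [d for d in merged if d.volume > 0]

# Example: evolve a small random field for a few steps.
field = [Dune(random.uniform(0, 10), random.uniform(0, 5), random.uniform(5, 20)) for _ in range(30)]
for _ in range(100):
    field = step(field)
```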
Abstract:
Idioms of distress communicate suffering via reference to shared ethnopsychologies, and better understanding of idioms of distress can contribute to effective clinical and public health communication. This systematic review is a qualitative synthesis of "thinking too much" idioms globally, to determine their applicability and variability across cultures. We searched eight databases and retained publications if they included empirical quantitative, qualitative, or mixed-methods research regarding a "thinking too much" idiom and were in English. In total, 138 publications from 1979 to 2014 met inclusion criteria. We examined the descriptive epidemiology, phenomenology, etiology, and course of "thinking too much" idioms and compared them to psychiatric constructs. "Thinking too much" idioms typically reference ruminative, intrusive, and anxious thoughts and result in a range of perceived complications, physical and mental illnesses, or even death. These idioms appear to have variable overlap with common psychiatric constructs, including depression, anxiety, and PTSD. However, "thinking too much" idioms reflect aspects of experience, distress, and social positioning not captured by psychiatric diagnoses and often show wide within-cultural variation, in addition to between-cultural differences. Taken together, these findings suggest that "thinking too much" should not be interpreted as a gloss for psychiatric disorder nor assumed to be a unitary symptom or syndrome within a culture. We suggest five key ways in which engagement with "thinking too much" idioms can improve global mental health research and interventions: it (1) incorporates a key idiom of distress into measurement and screening to improve validity of efforts at identifying those in need of services and tracking treatment outcomes; (2) facilitates exploration of ethnopsychology in order to bolster cultural appropriateness of interventions; (3) strengthens public health communication to encourage engagement in treatment; (4) reduces stigma by enhancing understanding, promoting treatment-seeking, and avoiding unintentionally contributing to stigmatization; and (5) identifies a key locally salient treatment target.