861 results for development, hyperpolarization-activated current, Cajal-Retzius, subplate, cortical plate
Abstract:
The purpose of this study was to evaluate current materials given to parents of new hearing aid users.
Abstract:
Children may be at higher risk than adults from pesticide exposure, due to their rapidly developing physiology, unique behavioral patterns, and interactions with the physical environment. This preliminary study conducted in Ecuador examines the association between household and environmental risk factors for pesticide exposure and neurobehavioral development. We collected data over 6 months in the rural highland region of Cayambe, Ecuador (2003–2004). Children aged 24–61 months residing in 3 communities were assessed with the Ages and Stages Questionnaire and the Visual Motor Integration Test. We gathered information on maternal health and work characteristics, the home and community environment, and child characteristics. Growth measurements and a hemoglobin finger-prick blood test were obtained. Multiple linear regression analyses were conducted. Current maternal employment in the flower industry was associated with better developmental scores. Longer hours playing outdoors were associated with lower gross and fine motor and problem-solving skills. Children who played with irrigation water scored lower on fine motor skills (8% decrease; 95% confidence interval −9.31 to −0.53), problem-solving skills (7% decrease; −8.40 to −0.39), and Visual Motor Integration test scores (3% decrease; −12.00 to −1.08). These results suggest that certain environmental risk factors for exposure to pesticides may affect child development, with contact with irrigation water of particular concern. However, the relationships between these risk factors and social characteristics are complex, as corporate agriculture may increase risk through pesticide exposure and environmental contamination, while indirectly promoting healthy development by providing health care, relatively higher salaries, and daycare options.
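A minimal sketch of the kind of multiple linear regression described in this abstract, assuming a tabular per-child dataset; the file name and column names (fine_motor_score, plays_with_irrigation_water, hours_outdoors, maternal_flower_work, age_months, hemoglobin) are hypothetical and are not the study's actual variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-child dataset with developmental scores and exposure factors
df = pd.read_csv("cayambe_children.csv")

# Regress one developmental outcome on household/environmental risk factors
model = smf.ols(
    "fine_motor_score ~ plays_with_irrigation_water + hours_outdoors"
    " + maternal_flower_work + age_months + hemoglobin",
    data=df,
).fit()

# Coefficients with 95% confidence intervals, as reported in the abstract
print(model.summary())
```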
Abstract:
The objectives of this paper are, first, to evaluate the economic, social and environmental effects of oil extraction in Ecuador during the last 41 years, and second, to discuss prospects for achieving a sustainable and equitable development path in the future, in the context of declining oil reserves. The current government is pursuing an extractivist policy, based on expanding oil extraction into formerly unexploited fields (including those inside the Yasuni National Park) and starting large-scale mining exploitation. Two future options will be evaluated: first, an expansion of extractive activities, and second, an alternative based on conservation, with sustainable use of natural resources (e.g. ecotourism, agroforestry, bio-knowledge), without further expansion of oil fields and mining.
Abstract:
Birds are vulnerable to collisions with human-made fixed structures. Despite ongoing development and increases in infrastructure, we have few estimates of the magnitude of collision mortality. We reviewed the existing literature on avian mortality associated with transmission lines and derived an initial estimate for Canada. Estimating mortality from collisions with power lines is challenging due to the lack of studies, especially from sites within Canada, and due to uncertainty about the magnitude of detection biases. Detection of bird collisions with transmission lines varies due to habitat type, species size, and scavenging rates. In addition, birds can be crippled by the impact and subsequently die, although crippling rates are poorly known and rarely incorporated into estimates. We used existing data to derive a range of estimates of avian mortality associated with collisions with transmission lines in Canada by incorporating detection, scavenging, and crippling biases. There are 231,966 km of transmission lines across Canada, mostly in the boreal forest. Mortality estimates ranged from 1 million to 229.5 million birds per year, depending on the bias corrections applied. We consider our most realistic estimate, taking into account variation in risk across Canada, to range from 2.5 million to 25.6 million birds killed per year. Data from multiple studies across Canada and the northern U.S. indicate that the most vulnerable bird groups are (1) waterfowl, (2) grebes, (3) shorebirds, and (4) cranes, which is consistent with other studies. Populations of several groups that are vulnerable to collisions are increasing across Canada (e.g., waterfowl, raptors), which suggests that collision mortality, at current levels, is not limiting population growth. However, there may be impacts on other declining species, such as shorebirds and some species at risk, including Alberta’s Trumpeter Swans (Cygnus buccinator) and western Canada’s endangered Whooping Cranes (Grus americana). Collisions may be more common during migration, which underscores the need to understand impacts across the annual cycle. We emphasize that these estimates are preliminary, especially considering the absence of Canadian studies.
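The bias-correction logic described above lends itself to a simple illustration: raw carcass counts are scaled up for carcasses that searchers miss, carcasses removed by scavengers, and crippled birds that die away from the line. The sketch below is a back-of-the-envelope illustration only; the per-km rate and bias values are placeholders, not the study's estimates (only the 231,966 km of transmission line comes from the abstract).

```python
def corrected_mortality(carcasses_per_km, km_of_line,
                        detection_rate, scavenging_loss, crippling_rate):
    """Scale raw carcass counts for searcher detection, scavenger removal,
    and crippled birds that die away from the line."""
    observed = carcasses_per_km * km_of_line
    adjusted = observed / detection_rate / (1.0 - scavenging_loss)
    return adjusted * (1.0 + crippling_rate)

# Hypothetical inputs: 10 carcasses/km/year over Canada's 231,966 km of line,
# 60% searcher detection, 30% scavenger removal, 25% crippling rate
print(corrected_mortality(10, 231_966,
                          detection_rate=0.6,
                          scavenging_loss=0.3,
                          crippling_rate=0.25))
```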
Abstract:
Mutations in several classes of embryonically-expressed transcription factor genes are associated with behavioral disorders and epilepsies. However, little is known about how such genetic and neurodevelopmental defects lead to brain dysfunction. Here we present the characterization of an epilepsy syndrome caused by the absence of the transcription factor SOX1 in mice. In vivo electroencephalographic recordings from SOX1 mutants established a correlation between behavioral changes and cortical output that was consistent with a seizure origin in the limbic forebrain. In vitro intracellular recordings from three major forebrain regions, neocortex, hippocampus and olfactory (piriform) cortex (OC), showed that only the OC exhibits abnormally enhanced synaptic excitability and spontaneous epileptiform discharges. Furthermore, the hyperexcitability of the OC neurons was present in mutants prior to the onset of seizures but was completely absent from both the hippocampus and neocortex of the same animals. The local inhibitory GABAergic neurotransmission remained normal in the OC of SOX1-deficient brains, but there was a severe developmental deficit of OC postsynaptic target neurons, mainly GABAergic projection neurons within the olfactory tubercle and the nucleus accumbens shell. Our data show that SOX1 is essential for ventral telencephalic development and suggest that the neurodevelopmental defect disrupts local neuronal circuits, leading to epilepsy in the SOX1-deficient mice.
Abstract:
Metastatic malignant melanoma remains a highly aggressive form of skin cancer for which no reliable methods for treatment exist. Given the increasing incidence of this cancer, considerable attention has focused on the development of new and improved methods for tackling this disease. Within this article, methods for treating melanoma are reviewed and discussed with particular attention focusing on prodrugs that are activated by the tyrosinase enzyme. This enzyme is up-regulated and is of elevated activity within malignant melanomas compared with healthy melanocytes, providing an ideal in-situ tool for the activation of melanoma prodrugs. By way of background to the prodrug strategies discussed within this review, the causes of melanoma, the enzymology of tyrosinase, and the chemistry of the biosynthetic pathways associated with melanogenesis are presented. Aspects of the design, mode of action, and biological profiles of key prodrugs that are activated by tyrosinase, and that show potential for the treatment of melanoma, are then presented and compared.
Abstract:
Severe wind storms are one of the major natural hazards in the extratropics and inflict substantial economic damages and even casualties. Insured storm-related losses depend on (i) the frequency, nature and dynamics of storms, (ii) the vulnerability of the values at risk, (iii) the geographical distribution of these values, and (iv) the particular conditions of the risk transfer. It is thus of great importance to assess the impact of climate change on future storm losses. To this end, the current study employs—to our knowledge for the first time—a coupled approach, using output from high-resolution regional climate model scenarios for the European sector to drive an operational insurance loss model. An ensemble of coupled climate-damage scenarios is used to provide an estimate of the inherent uncertainties. Output of two state-of-the-art global climate models (HadAM3, ECHAM5) is used for present (1961–1990) and future climates (2071–2100, SRES A2 scenario). These serve as boundary data for two nested regional climate models with sophisticated gust parametrizations (CLM, CHRM). For validation and calibration purposes, an additional simulation is undertaken with the CHRM driven by the ERA40 reanalysis. The operational insurance model (Swiss Re) uses a European-wide damage function, an average vulnerability curve for all risk types, and contains the actual value distribution of a complete European market portfolio. The coupling between climate and damage models is based on daily maxima of 10 m gust winds, and the strategy adopted consists of three main steps: (i) development and application of a pragmatic selection criterion to retrieve significant storm events, (ii) generation of a probabilistic event set using a Monte-Carlo approach in the hazard module of the insurance model, and (iii) calibration of the simulated annual expected losses with a historic loss database. The climate models considered agree regarding an increase in the intensity of extreme storms in a band across central Europe (stretching from southern UK and northern France to Denmark, northern Germany into eastern Europe). This effect increases with event strength, and rare storms show the largest climate change sensitivity, but are also beset with the largest uncertainties. Wind gusts decrease over northern Scandinavia and Southern Europe. The highest intra-ensemble variability is simulated for Ireland, the UK, the Mediterranean, and parts of Eastern Europe. The resulting changes in European-wide losses over the 110-year period are positive for all layers and all model runs considered and amount to 44% (annual expected loss), 23% (10 years loss), 50% (30 years loss), and 104% (100 years loss). There is a disproportionate increase in losses for rare high-impact events. The changes result from increases in both severity and frequency of wind gusts. Considerable geographical variability of the expected losses exists, with Denmark and Germany experiencing the largest loss increases (116% and 114%, respectively). All countries considered except for Ireland (−22%) experience some loss increases. Some ramifications of these results for the socio-economic sector are discussed, and future avenues for research are highlighted. The technique introduced in this study and its application to realistic market portfolios offer exciting prospects for future research on the impact of climate change that is relevant for policy makers, scientists and economists.
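As a rough illustration of the hazard-to-loss chain outlined above (a probabilistic storm event set mapped through a damage function to annual expected and return-period losses), the toy Monte Carlo sketch below may help; the gust distribution, event frequency and vulnerability curve are invented placeholders and do not represent the Swiss Re model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 10_000  # simulated years of storm activity

def damage(gust_ms):
    # Placeholder vulnerability curve: losses grow rapidly above a gust threshold
    return np.clip(gust_ms - 25.0, 0.0, None) ** 3

# For each simulated year, draw a random number of storms and sum their losses
annual_losses = np.array([
    damage(rng.gumbel(loc=28.0, scale=4.0, size=rng.poisson(3))).sum()
    for _ in range(n_years)
])

print("annual expected loss:", annual_losses.mean())
print("10-year loss:", np.quantile(annual_losses, 1 - 1 / 10))
print("100-year loss:", np.quantile(annual_losses, 1 - 1 / 100))
```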
Abstract:
A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with any inherent strengths or weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models. The modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model input and output, and a closer working relationship between scientists and policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
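As a hedged illustration of the aggregate (multi-pathway) and probabilistic modelling the abstract recommends, the sketch below sums hypothetical dietary, consumer product and inhalation intakes per unit body weight across Monte Carlo iterations; all distributions and parameter values are invented for illustration and do not come from any of the models discussed.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo iterations to capture variability and uncertainty

body_weight = rng.normal(70.0, 12.0, n)             # kg
dietary     = rng.lognormal(np.log(0.02), 0.5, n)   # mg/day from food
consumer    = rng.lognormal(np.log(0.005), 0.8, n)  # mg/day from consumer products
inhalation  = rng.lognormal(np.log(0.001), 0.6, n)  # mg/day from indoor air

# Aggregate exposure: sum over pathways, expressed per kg body weight per day
aggregate_dose = (dietary + consumer + inhalation) / body_weight

print("median dose (mg/kg bw/day):", np.median(aggregate_dose))
print("95th percentile:", np.quantile(aggregate_dose, 0.95))
```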
Abstract:
Development geography has long sought to understand why inequalities exist and the best ways to address them. Dependency theory sets out an historical rationale for underdevelopment based on colonialism and a legacy of a developed core and an under-developed periphery. Race is relevant in this theory only insofar as Europeans are white and the places they colonised were occupied by people with darker skin colour; there are no innate biological reasons why it happened in that order. However, a new theory for national inequalities proposed by Lynn and Vanhanen in a series of publications makes the case that poorer countries have that status because of a poorer genetic stock rather than an accident of history. They argue that IQ has a genetic basis and that IQ is linked to ability. Thus races with a poorer IQ have less ability, and national IQ can therefore be positively correlated with performance as measured by an indicator like GDP/capita. Their thesis is one of despair, as little can be done to improve genetic stock significantly other than a programme of eugenics. This paper summarises and critiques the Lynn and Vanhanen hypothesis and the assumptions upon which it is based, and uses this analysis to show how a human desire to simplify in order to manage can be dangerous in development geography. While attention may naturally be focused on the 'national IQ' variables as a proxy measure of 'innate ability', the assumption of GDP per capita as an indicator of 'success' and 'achievement' is far more readily accepted without criticism. The paper makes the case that the current vogue for indicators, indices and cause-effect can be tyrannical.
Abstract:
Genetically modified (GM) crops and sustainable development remain the foci of much media attention, especially given current concerns about a global food crisis. However, whilst the latter is embraced with enthusiasm by almost all groups, GM crops generate very mixed views. Some countries have welcomed GM, but others, notably those in Europe, have adopted a cautious stance. This article aims to review the contribution that GM crops can make to agricultural sustainability in the developing world. Following brief reviews of both issues and their linkages, notably the pros and cons of GM cotton as a contributory factor in sustainability, a number of case studies from resource-poor cotton farmers in Makhathini Flats, South Africa, are presented for a six-year period. Data on expenditure, productivity and income indicate that Bacillus thuringiensis (Bt) cotton is advantageous because it reduces costs, for example of pesticides, and increases income, and the indications are that those benefits continued over at least the six years covered by the studies. There are repercussions of the additional income in the households; debts are reduced and money is invested in children's education and in the farms. However, in the general GM debate, the results show that GM crops are not miracle products which alleviate poverty at a stroke, but nor is there evidence that they will cause the scale of environmental damage associated with indiscriminate pesticide use. Indeed, for some GM antagonists, perhaps even the majority, such debates are irrelevant: the transfer of genes between species is unnatural and unethical. For them, GM crops will never be acceptable despite the evidence and pressure to increase world food production.
Abstract:
Partnerships are complex, diverse and subtle relationships, the nature of which changes with time, but they are vital for the functioning of the development chain. This paper reviews the meaning of partnership between development institutions as well as some of the main approaches taken to analyse the relationships. The latter typically revolve around analyses based on power, discourse, interdependence and functionality. The paper makes the case for taking a multianalytical approach to understanding partnership but points out three problem areas: identifying acceptable/unacceptable trade-offs between characteristics of partnership, the analysis of multicomponent partnerships (where one partner has a number of other partners) and the analysis of long-term partnership. The latter is especially problematic for long-term partnerships between donors and field agencies that share an underlying commitment based on religious beliefs. These problems with current methods of analysing partnership are highlighted by focusing upon the Catholic Church-based development chain, linking donors in the North (Europe) and their field partners in the South (Abuja Ecclesiastical Province, Nigeria). It explores a narrated history of a relationship with a single donor spanning 35 years from the perspective of one partner (the field agency).
Abstract:
The global monsoon system is so varied and complex that understanding and predicting its diverse behaviour remains a challenge that will occupy modellers for many years to come. Despite the difficult task ahead, an improved monsoon modelling capability has been realized through the inclusion of more detailed physics of the climate system and higher resolution in our numerical models. Perhaps the most crucial improvement to date has been the development of coupled ocean-atmosphere models. From subseasonal to interdecadal time scales, only through the inclusion of air-sea interaction can the proper phasing and teleconnections of convection be attained with respect to sea surface temperature variations. Even then, the response to slow variations in remote forcings (e.g., El Niño—Southern Oscillation) does not result in a robust solution, as there are a host of competing modes of variability that must be represented, including those that appear to be chaotic. Understanding the links between monsoons and land surface processes is not as mature as that explored regarding air-sea interactions. A land surface forcing signal appears to dominate the onset of wet season rainfall over the North American monsoon region, though the relative role of ocean versus land forcing remains a topic of investigation in all the monsoon systems. Also, improved forecasts have been made during periods in which additional sounding observations are available for data assimilation. Thus, there is untapped predictability that can only be attained through the development of a more comprehensive observing system for all monsoon regions. Additionally, improved parameterizations - for example, of convection, cloud, radiation, and boundary layer schemes as well as land surface processes - are essential to realize the full potential of monsoon predictability. A more comprehensive assessment is needed of the impact of black carbon aerosols, which may modulate that of other anthropogenic greenhouse gases. Dynamical considerations require ever increased horizontal resolution (probably to 0.5 degree or higher) in order to resolve many monsoon features including, but not limited to, the Mei-Yu/Baiu sudden onset and withdrawal, low-level jet orientation and variability, and orographic forced rainfall. Under anthropogenic climate change many competing factors complicate making robust projections of monsoon changes. Absent aerosol effects, increased land-sea temperature contrast suggests strengthened monsoon circulation due to climate change. However, increased aerosol emissions will reflect more solar radiation back to space, which may temper or even reduce the strength of monsoon circulations compared to the present day. Precipitation may behave independently from the circulation under warming conditions in which an increased atmospheric moisture loading, based purely on thermodynamic considerations, could result in increased monsoon rainfall under climate change. The challenge to improve model parameterizations and include more complex processes and feedbacks pushes computing resources to their limit, thus requiring continuous upgrades of computational infrastructure to ensure progress in understanding and predicting current and future behaviour of monsoons.
Abstract:
From April 2010, the General Pharmaceutical Council (GPhC) will be responsible for the statutory regulation of pharmacists and pharmacy technicians in Great Britain (GB).[1] All statutorily regulated health professionals will need to periodically demonstrate their fitness-to-practise through a process of revalidation.[2] One option being considered in GB is that continuing professional development (CPD) records will form part of the evidence submitted for revalidation, similar to the system in New Zealand.[3] At present, from 1 March 2009, pharmacy professionals must make a minimum of nine CPD entries per annum using the Royal Pharmaceutical Society of Great Britain (RPSGB) CPD framework. Our aim was to explore the applicability of new revalidation standards within the current CPD framework. We also wanted to review the content of CPD portfolios to assess strengths and qualities and identify any information gaps for the purpose of revalidation.
Abstract:
Capillary electrophoresis (CE) offers the analyst a number of key advantages for the analysis of the components of foods. CE offers better resolution than, say, high-performance liquid chromatography (HPLC), and is more adept at the simultaneous separation of a number of components of different chemistries within a single matrix. In addition, CE requires less rigorous sample cleanup procedures than HPLC, while offering the same degree of automation. However, despite these advantages, CE remains under-utilized by food analysts. Therefore, this review consolidates and discusses the currently reported applications of CE that are relevant to the analysis of foods. Some discussion is also devoted to the development of these reported methods and to the advantages/disadvantages compared with the more usual methods for each particular analysis. It is the aim of this review to give practicing food analysts an overview of the current scope of CE.
Abstract:
In this paper we review the experimental development of agri-environment measures for use on grasslands. Sward structure has been shown to have a strong influence on birds' ability to forage in grasslands, but the effects of food abundance on foraging behaviour are poorly understood and this hinders development of grassland conservation measures. The experiments described have a dual purpose: to investigate the foraging ecology of birds on grasslands and to test candidate management measures. Most of the work featured focuses on increasing invertebrate food resources during the summer by increasing habitat heterogeneity. We also identify important gaps in the habitats provided by existing or experimental measures, where similar dual-purpose experiments are required.