889 results for Twin coronet porphyrins
Abstract:
Ovarian follicle development continues in a wave-like manner during the bovine oestrous cycle, giving rise to variation in the duration of ovulatory follicle development. The objectives of the present study were to determine whether a relationship exists between the duration of ovulatory follicle development and pregnancy rates following artificial insemination (AI) in dairy cows undergoing spontaneous oestrous cycles, and to identify factors influencing follicle turnover and pregnancy rate and the relationship between these two variables. Follicle development was monitored by daily transrectal ultrasonography from 10 days after oestrus until the subsequent oestrus in 158 lactating dairy cows. The cows were artificially inseminated following the second observed oestrus and pregnancy was diagnosed 35 days later. The predominant pattern of follicle development was two follicle waves (74.7%), with three follicle waves in 22.1% of oestrous cycles and four or more follicle waves in 3.2% of oestrous cycles. The interval from ovulatory follicle emergence to oestrus (EOI) was 3 days longer (P < 0.0001) in cows with two follicle waves than in those with three waves. Ovulatory follicles from two-wave oestrous cycles grew more slowly but were approximately 2 mm larger (P < 0.0001) on the day of oestrus. Twin ovulations were observed in 14.2% of oestrous cycles and occurred more frequently (P < 0.001) in three-wave oestrous cycles; consequently, EOI was shorter in cows with twin ovulations. Overall, 57.0% of the cows were diagnosed pregnant 35 days after AI. Linear logistic regression analysis revealed an inverse relationship between EOI and the proportion of cows diagnosed pregnant, among all cows (n = 158; P < 0.01) and amongst those with single ovulations (n = 145; P < 0.05).
Mean EOI was approximately 1 day shorter (P < 0.01) in cows that became pregnant than in non-pregnant cows; however, pregnancy rates did not differ significantly among cows with different patterns of follicle development. These findings confirm and extend previous observations in pharmacologically manipulated cattle and show, for the first time, that in dairy cows undergoing spontaneous oestrous cycles, natural variation in the duration of post-emergence ovulatory follicle development has a significant effect on pregnancy rate, presumably reflecting variation in oocyte developmental competence.
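The inverse EOI-pregnancy relationship described above can be illustrated with a toy linear logistic regression fitted by Newton-Raphson. This is a minimal sketch on synthetic data; the sample size, EOI range, and coefficients are illustrative assumptions, not the study's data.

```python
import numpy as np

# Synthetic data: pregnancy probability assumed to decline with EOI (days).
rng = np.random.default_rng(3)
eoi = rng.uniform(6.0, 12.0, size=200)
p_true = 1.0 / (1.0 + np.exp(-(4.0 - 0.4 * eoi)))  # assumed inverse relationship
pregnant = rng.random(200) < p_true

# Logistic regression of outcome on EOI via Newton-Raphson iterations.
X = np.column_stack([np.ones_like(eoi), eoi])      # intercept + EOI
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (pregnant - p)                    # score vector
    hess = X.T @ (X * (p * (1.0 - p))[:, None])    # observed information
    beta += np.linalg.solve(hess, grad)
# A negative fitted slope on EOI reproduces the reported inverse relationship.
```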
Abstract:
The large-scale production of clean energy is one of the major challenges society is currently facing. Molecular hydrogen is envisaged as a key green fuel for the future, but it becomes a sustainable alternative to classical fuels only if it is also produced in a clean fashion. Here, we report a supramolecular biomimetic approach to form a catalyst that produces molecular hydrogen using light as the energy source. It is composed of chromophores assembled onto a bis(thiolate)-bridged diiron ([2Fe2S])-based hydrogenase catalyst. The supramolecular building-block approach introduced in this article enabled the easy formation of a series of complexes, all of which were thoroughly characterized, revealing that the photoactivity of the catalyst assembly strongly depends on its nature. The active species, formed from different complexes, appears to be [Fe2(μ-pdt)(CO)4{PPh2(4-py)}2] (3) with two different porphyrins (5a and 5b) coordinated to it. The modular supramolecular approach was important in this study, as several different complexes could be generated from a limited number of building blocks.
Abstract:
We present stereoscopic images of an Earth-impacting Coronal Mass Ejection (CME). The CME was imaged by the Heliospheric Imagers onboard the twin STEREO spacecraft during December 2008. The apparent acceleration of the CME is used to provide independent estimates of its speed and direction from the two spacecraft. Three distinct signatures within the CME were all found to be closely Earth-directed. At the time that the CME was predicted to pass the ACE spacecraft, in-situ observations contained a typical CME signature. At Earth, ground-based magnetometer observations showed a small but widespread sudden response to the compression of the geomagnetic cavity at CME impact. In this case, STEREO could have given warning of CME impact at least 24 hours in advance. These stereoscopic observations represent a significant milestone for the STEREO mission and have significant potential for improving operational space weather forecasting.
Abstract:
Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. 
Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges. Copyright © 2009 Royal Meteorological Society
Abstract:
In this paper, sequential importance sampling is used to assess the impact of observations on an ensemble prediction of the decadal path transitions of the Kuroshio Extension (KE). This particle-filtering approach gives access to the probability density of the state vector, which allows us to determine the predictive power (an entropy-based measure) of the ensemble prediction. The proposed set-up makes use of an ensemble that, at each time, samples the climatological probability distribution. Then, in a post-processing step, the impact of different sets of observations is measured by the increase in predictive power of the ensemble over the climatological signal during one year. The method is applied in an identical-twin experiment for the Kuroshio Extension using a reduced-gravity shallow-water model. We investigate the impact of assimilating velocity observations from different locations during the elongated and the contracted meandering states of the KE. Optimal observation locations correspond to regions with strong potential vorticity gradients. For the elongated state the optimal location is in the first meander of the KE; during the contracted state it is located south of Japan, where the Kuroshio separates from the coast.
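The reweighting and entropy-based predictive power described above can be sketched in one dimension. This is a minimal illustration, not the paper's reduced-gravity model: the toy state distribution, observation value, and Gaussian likelihood are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Climatological ensemble: N particles sampling the long-term state distribution
# (a 1-D standard normal stands in for the model climatology here).
N = 200
particles = rng.normal(loc=0.0, scale=1.0, size=N)
weights = np.full(N, 1.0 / N)                 # equal climatological weights

# Importance-sampling step: reweight each particle by the likelihood of an
# observation y_obs with Gaussian error of standard deviation sigma_obs.
y_obs, sigma_obs = 0.8, 0.5
likelihood = np.exp(-0.5 * ((particles - y_obs) / sigma_obs) ** 2)
weights = weights * likelihood
weights /= weights.sum()

# Predictive power as the relative entropy of the reweighted ensemble with
# respect to the uniform climatological weights; zero would mean the
# observations add no information over climatology.
predictive_power = float(np.sum(weights * np.log(weights * N)))
```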
Abstract:
Cloud imagery is not currently used in numerical weather prediction (NWP) to extract the type of dynamical information that experienced forecasters have extracted subjectively for many years. For example, rapidly developing mid-latitude cyclones have characteristic signatures in the cloud imagery that are most fully appreciated from a sequence of images rather than from a single image. The Met Office is currently developing a technique to extract dynamical development information from satellite imagery using their full incremental 4D-Var (four-dimensional variational data assimilation) system. We investigate a simplified form of this technique in a fully nonlinear framework. We convert information on the vertical wind field, w(z), and profiles of temperature, T(z, t), and total water content, qt(z, t), as functions of height, z, and time, t, to a single brightness temperature by defining a 2D (vertical and time) variational assimilation testbed. The profiles of w, T and qt are updated using a simple vertical advection scheme. We define a basic cloud scheme to obtain the fractional cloud amount and, when combined with the temperature field, we convert this information into a brightness temperature, having developed a simple radiative transfer scheme. With the exception of some matrix inversion routines, all our code is developed from scratch. Throughout the development process we test all aspects of our 2D assimilation system, and then run identical-twin experiments to try to recover information on the vertical velocity from a sequence of observations of brightness temperature. This thesis contains a comprehensive description of our nonlinear models and assimilation system, and the first experimental results.
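A simple vertical advection scheme of the kind mentioned above can be sketched with first-order upwind differencing. This is a minimal illustration under stated assumptions (upwind scheme, fixed boundary values); the thesis's actual scheme may differ.

```python
import numpy as np

def advect_profile(q, w, dz, dt):
    """One upwind vertical-advection step for a profile q(z) with
    vertical wind w(z) on a uniform grid of spacing dz.

    First-order upwind differencing with boundary values held fixed;
    both choices are illustrative assumptions.
    """
    q_new = q.copy()
    for k in range(1, len(q) - 1):
        if w[k] >= 0.0:   # upward motion: take the gradient from below
            q_new[k] = q[k] - w[k] * dt * (q[k] - q[k - 1]) / dz
        else:             # downward motion: take the gradient from above
            q_new[k] = q[k] - w[k] * dt * (q[k + 1] - q[k]) / dz
    return q_new
```

A uniform profile should be left unchanged by advection, which makes a convenient sanity check for the scheme.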
Abstract:
A first-of-a-kind, extended-term cloud aircraft campaign was conducted to obtain an in-situ statistical characterization of continental boundary-layer clouds needed to investigate cloud processes and refine retrieval algorithms. Coordinated by the Atmospheric Radiation Measurement (ARM) Aerial Facility (AAF), the Routine AAF Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign operated over the ARM Southern Great Plains (SGP) site from 22 January to 30 June 2009, collecting 260 h of data during 59 research flights. A comprehensive payload aboard the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter aircraft measured cloud microphysics, solar and thermal radiation, physical aerosol properties, and atmospheric state parameters. Proximity to the SGP's extensive complement of surface measurements provides ancillary data that supports modeling studies and facilitates evaluation of a variety of surface retrieval algorithms. The five-month duration enabled sampling a range of conditions associated with the seasonal transition from winter to summer. Although about two-thirds of the cloud flights occurred in May and June, boundary-layer cloud fields were sampled under a variety of environmental and aerosol conditions, with about 77% of the flights occurring in cumulus and stratocumulus. Preliminary analyses illustrate use of these data to analyze cloud-aerosol relationships, characterize the horizontal variability of cloud radiative impacts, and evaluate surface-based retrievals. We discuss how an extended-term campaign requires a simplified operating paradigm that is different from that used for typical, short-term, intensive aircraft field programs.
Abstract:
In this study, the authors evaluate the El Niño–Southern Oscillation (ENSO)–Asian monsoon interaction in a version of the Hadley Centre coupled ocean–atmosphere general circulation model (CGCM) known as HadCM3. The main focus is on two evolving anomalous anticyclones: one located over the south Indian Ocean (SIO) and the other over the western North Pacific (WNP). These two anomalous anticyclones are closely related to the developing and decaying phases of ENSO and play a crucial role in linking the Asian monsoon to ENSO. It is found that HadCM3 simulates well the main features of the evolution of both anomalous anticyclones and the related SST dipoles, in association with the different phases of the ENSO cycle. Using the simulated results, the authors examine the relationship between the WNP/SIO anomalous anticyclones and the ENSO cycle, in particular the biennial component of the relationship. It is found that a strong El Niño event tends to be followed by a more rapid decay and is much more likely to become a La Niña event in the subsequent winter. The twin anomalous anticyclones in the western Pacific in the summer of a decaying El Niño are crucial for the transition from an El Niño into a La Niña. The El Niño (La Niña) events, especially the strong ones, significantly strengthen the correspondence between the SIO anticyclonic (cyclonic) anomaly in the preceding autumn and the WNP anticyclonic (cyclonic) anomaly in the subsequent spring, and favor the persistence of the WNP anomaly from spring to summer. The present results suggest that both El Niño (La Niña) and the SIO/WNP anticyclonic (cyclonic) anomalies are closely tied with the tropospheric biennial oscillation (TBO).
In addition, variability in the East Asian summer monsoon, which is dominated by the internal atmospheric variability, seems to be responsible for the appearance of the WNP anticyclonic anomaly through an upper-tropospheric meridional teleconnection pattern over the western and central Pacific.
Abstract:
The problem of spurious excitation of gravity waves in the context of four-dimensional data assimilation is investigated using a simple model of balanced dynamics. The model admits a chaotic vortical mode coupled to a comparatively fast gravity wave mode, and can be initialized such that the model evolves on a so-called slow manifold, where the fast motion is suppressed. Identical twin assimilation experiments are performed, comparing the extended and ensemble Kalman filters (EKF and EnKF, respectively). The EKF uses a tangent linear model (TLM) to estimate the evolution of forecast error statistics in time, whereas the EnKF uses the statistics of an ensemble of nonlinear model integrations. Specifically, the case is examined where the true state is balanced, but observation errors project onto all degrees of freedom, including the fast modes. It is shown that the EKF and EnKF will assimilate observations in a balanced way only if certain assumptions hold, and that, outside of ideal cases (i.e., with very frequent observations), dynamical balance can easily be lost in the assimilation. For the EKF, the repeated adjustment of the covariances by the assimilation of observations can easily unbalance the TLM, and destroy the assumptions on which balanced assimilation rests. It is shown that an important factor is the choice of initial forecast error covariance matrix. A balance-constrained EKF is described and compared to the standard EKF, and shown to offer significant improvement for observation frequencies where balance in the standard EKF is lost. The EnKF is advantageous in that balance in the error covariances relies only on a balanced forecast ensemble, and that the analysis step is an ensemble-mean operation. Numerical experiments show that the EnKF may be preferable to the EKF in terms of balance, though its validity is limited by ensemble size. 
It is also found that overobserving can lead to a more unbalanced forecast ensemble and thus to an unbalanced analysis.
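The EnKF analysis step referred to above, where the update is computed from the statistics of a nonlinear forecast ensemble, can be sketched in its perturbed-observations form. The two-variable toy state, observation operator, and error levels below are illustrative assumptions, not the balanced-dynamics model of the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forecast ensemble: Ne members of an n-dimensional state (rows = members).
Ne, n = 50, 2
ensemble = rng.normal(size=(Ne, n))

H = np.array([[1.0, 0.0]])                    # observe the first variable only
R = np.array([[0.1]])                         # observation-error covariance
y = np.array([0.5])                           # the observation

# Kalman gain estimated from the ensemble sample covariance.
X = ensemble - ensemble.mean(axis=0)          # ensemble perturbations
P = X.T @ X / (Ne - 1)                        # sample forecast covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain

# Perturbed-observations update: each member assimilates the observation plus
# a random draw from the observation-error distribution, so the analysis
# ensemble carries the correct spread on average.
y_pert = y + rng.normal(scale=np.sqrt(R[0, 0]), size=(Ne, 1))
analysis = ensemble + (y_pert - ensemble @ H.T) @ K.T
```

Assimilating the observation should contract the ensemble spread in the observed variable relative to the forecast.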
Abstract:
Severe acute malnutrition is a major cause of child death in developing countries. In a recent study, Smith et al. (2013) monitored a large twin cohort in Malawi to unveil a causal relationship between gut microbiota and weight loss in undernutrition.
Abstract:
This thesis aims to investigate the development and functions of public libraries in Rome and the Roman world. After a preface with maps of libraries in Rome, Section I discusses the precursors for public library provision in the private book collections of Republican Rome, and their transfer into the public domain with the first public libraries of Asinius Pollio and Augustus. Section II contains three 'case studies' of public libraries' different roles. The Augustan library programme is used in Ch. II.1 to examine the role of imperial public libraries in literary life and the connections between Rome's libraries and those of Alexandria. Chapter II.2 concentrates on the libraries of Trajan's Forum to explore the intersection of imperial public libraries and monumental public architecture. This chapter responds to an important recent article by arguing for the continued identification of the Forum's libraries with twin brick buildings at its northern end, and suggests a series of correspondences between these libraries and its other monumental components. The conclusions of this chapter are important when considering the public libraries of the wider empire, several of which seem to have been inspired by the Trajanic libraries. Chapter II.3 considers imperial public libraries and leisure by looking at the evidence for libraries within bath-house complexes, concluding that their presence there is consistent with the archaeological and epigraphic evidence and fits in well with what we know of the intellectual and cultural life of these structures. Section III examines various aspects of the practical function of Roman public libraries: their contents (books and archives), division into Latin and Greek sections, provisions for shelving and cataloguing, staff, usership, architectural form, decoration, and housing of works of art. The picture that emerges is of carefully designed and functional buildings intended to sustain public, monumental, and practical functions.
Section IV uses a variety of texts to examine the way in which libraries were viewed and used. Ch. IV.1 discusses the evidence for use of libraries by scholars and authors such as Gellius, Galen, Josephus, and Apuleius. Ch. IV.2 examines parallels between library collections and compendious encyclopaedic elements within Roman literature and considers how library collections came to be canon-forming institutions and vehicles for the expression of imperial approval or disapproval towards authors. The channels through which this imperial influence flowed are investigated in Ch. IV.3, which looks at the directors and staff of the public libraries of Rome. The final section (V) of the thesis concerns public libraries outside the city of Rome. Provincial libraries provide a useful case study in 'Romanisation': they reveal a range of influences and are shown to embody local, personal, and metropolitan imperial identities. There follows a brief conclusion, and a bibliography. There are also five appendices of numismatic and epigraphic material discussed in the text. This material has not been adequately or completely gathered elsewhere and is intended to assist the reader; where appropriate it includes illustrations, transcriptions, and translations.
Abstract:
It has been reported that the ability to solve syllogisms is highly g-loaded. In the present study, using a self-administered shortened version of a syllogism-solving test, the BAROCO Short, we examined whether robust findings generated by previous research regarding IQ scores were also applicable to BAROCO Short scores. Five syllogism-solving problems were included in a questionnaire as part of a postal survey conducted by the Keio Twin Research Center. Data were collected from 487 pairs of twins (1021 individuals) who were Japanese junior high or high school students (ages 13–18) and from 536 mothers and 431 fathers. Four findings related to IQ were replicated: 1) The mean level increased gradually during adolescence, stayed unchanged from the 30s to the early 50s, and subsequently declined after the late 50s. 2) The scores for both children and parents were predicted by the socioeconomic status of the family. 3) The genetic effect increased, although the shared environmental effect decreased during progression from adolescence to adulthood. 4) Children's scores were genetically correlated with school achievement. These findings further substantiate the close association between syllogistic reasoning ability and g.
Abstract:
A potential problem with the Ensemble Kalman Filter is the implicit Gaussian assumption at analysis times. Here we explore the performance of a recently proposed fully nonlinear particle filter on a high-dimensional but simplified ocean model, in which the Gaussian assumption is not made. The model simulates the evolution of the vorticity field in time, described by the barotropic vorticity equation, in a highly nonlinear flow regime. While common knowledge is that particle filters are inefficient and need large numbers of model runs to avoid degeneracy, the newly developed particle filter needs only of the order of 10-100 particles on large-scale problems. The crucial new ingredient is that the proposal density can be used not only to ensure that all particles end up in high-probability regions of state space as defined by the observations, but also to ensure that most of the particles have similar weights. Using identical-twin experiments we found that the ensemble mean follows the truth reliably, and the difference from the truth is captured by the ensemble spread. A rank histogram is used to show that the truth run is indistinguishable from any of the particles, demonstrating the statistical consistency of the method.
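The rank-histogram diagnostic mentioned above can be sketched as follows: the truth is ranked within each sorted ensemble, and a statistically consistent ensemble yields uniformly distributed ranks (a flat histogram). The synthetic 1-D data below are an illustrative assumption, not the vorticity-model output.

```python
import numpy as np

rng = np.random.default_rng(2)

def truth_ranks(truth, ensembles):
    """Rank of each truth value within its sorted N-member ensemble.

    Ranks run from 0 (truth below all members) to N (truth above all
    members); uniformity of the ranks indicates statistical consistency.
    """
    return np.array(
        [np.searchsorted(np.sort(ens), t) for t, ens in zip(truth, ensembles)]
    )

# Synthetic consistency check: truth drawn from the same distribution
# as the ensemble members, so the histogram should come out roughly flat.
T, N = 2000, 9
ensembles = rng.normal(size=(T, N))
truth = rng.normal(size=T)
ranks = truth_ranks(truth, ensembles)
counts = np.bincount(ranks, minlength=N + 1)   # the rank histogram
```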
Abstract:
We investigate the initialization of Northern-hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates significantly reduces assimilation error both in identical-twin experiments and when assimilating sea-ice observations, reducing the concentration error by a factor of four to six, and the thickness error by a factor of two. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that the strong dependence of thermodynamic ice growth on ice concentration necessitates an adjustment of mean ice thickness in the analysis update. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that proportional mean-thickness updates are superior to the other two methods considered and enable us to assimilate sea ice in a global climate model using simple Newtonian relaxation.
Abstract:
We investigate the initialisation of Northern Hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates leads to good assimilation performance for sea-ice concentration and thickness, both in identical-twin experiments and when assimilating sea-ice observations. The simulation of other Arctic surface fields in the coupled model is, however, not significantly improved by the assimilation. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that an adjustment of mean ice thickness in the analysis update is essential to arrive at plausible state estimates. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that assimilation with proportional mean-thickness updates outperforms the other two methods considered. The method described here is very simple to implement, and gives results that are sufficiently good to be used for initialising sea ice in a global climate model for seasonal to decadal predictions.
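The analysis update described above, Newtonian relaxation of concentration with a proportional mean-thickness increment, can be sketched as follows. The relaxation form, the clipping of concentration to [0, 1], and the proportionality constant `r` (metres of mean thickness per unit concentration) are illustrative assumptions, not values from the study.

```python
import numpy as np

def assimilation_update(c_model, h_model, c_obs, tau, dt, r=1.0):
    """One analysis update for sea-ice concentration c and mean thickness h.

    Concentration is relaxed toward the observed concentration c_obs with
    timescale tau (Newtonian relaxation); the mean-thickness increment is
    taken proportional to the concentration increment, with assumed
    proportionality constant r.
    """
    dc = (c_obs - c_model) * dt / tau            # concentration increment
    c_new = np.clip(c_model + dc, 0.0, 1.0)      # keep concentration physical
    h_new = np.maximum(h_model + r * dc, 0.0)    # proportional thickness update
    return c_new, h_new
```

For example, with a model concentration of 0.4, observed concentration 0.9, tau = 5 time steps and dt = 1, the concentration increment is 0.1 and the mean thickness increases by r times that amount.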