Abstract:
Lifetime reproductive success in female insects is often egg- or time-limited. For instance, in pro-ovigenic species, when oviposition sites are abundant, females may quickly become devoid of eggs. Conversely, in the absence of suitable oviposition sites, females may die before laying all of their eggs. In pollinating fig wasps (Hymenoptera: Agaonidae), each species has an obligate mutualism with its host fig tree species [Ficus spp. (Moraceae)]. These pro-ovigenic wasps oviposit in individual ovaries within the inflorescences of monoecious Ficus (syconia, or ‘figs’), which contain many flowers. Each female flower can thus become a seed or be converted into a wasp gall. The mystery is that the wasps never oviposit in all fig ovaries, even when a fig contains enough wasp females with enough eggs to do so. The failure of all wasps to translate all of their eggs into offspring clearly contributes to mutualism persistence, but the underlying causal mechanisms are unclear. We found, in an undescribed Brazilian Pegoscapus wasp population, that the lifetime reproductive success of lone foundresses was relatively unaffected by constraints on oviposition. The number of offspring produced by lone foundresses experimentally introduced into receptive figs was generally lower than the number of eggs carried, despite the fact that the wasps were able to lay all or most of their eggs. Because we excluded any effects of intraspecific competitors and parasitic non-pollinating wasps, our data suggest that some pollinators produce few offspring because some of their eggs or larvae are unviable or are victims of plant defences.
Abstract:
This paper presents practical approaches to the problem of sample size re-estimation in clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experiences are available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates.
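The mechanics behind such a blinded review can be illustrated with the standard Schoenfeld event-count approximation for proportional-hazards designs. The sketch below is not the authors' procedure: the target hazard ratio and the pooled event fraction are assumed values, and the blinded step uses only pooled (treatment-label-free) interim data.

```python
import math
from scipy.stats import norm

def required_events(hr, alpha=0.05, power=0.80, allocation=0.5):
    """Schoenfeld approximation: total events needed for a two-sided
    log-rank test to detect hazard ratio `hr` under proportional
    hazards, with `allocation` the fraction randomised to one arm."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return (z_a + z_b) ** 2 / (allocation * (1 - allocation) * math.log(hr) ** 2)

def blinded_reestimated_n(events_needed, pooled_event_fraction):
    """Rescale the number of patients so the trial still expects the
    required events, using only the blinded pooled event fraction."""
    return math.ceil(events_needed / pooled_event_fraction)

d = required_events(hr=0.75)                       # ~380 events for HR 0.75
print(d, blinded_reestimated_n(d, pooled_event_fraction=0.6))
```

Because the re-estimation uses only the pooled event fraction, no treatment-effect information is revealed, which is what keeps the pre-specified error rates intact.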
The impact of information and communications technology on commercial real estate in the new economy
Abstract:
Purpose – This paper seeks to critically review the conceptual frameworks that have been developed for assessing the impact of information and communications technology (ICT) on real estate. Design/methodology/approach – The research is based on a critical review of the existing literature and draws on examples of previous empirical research in the field. Findings – The paper suggests that a “socio-technical framework” is more appropriate for examining ICT impact in real estate than other “deterministic” frameworks. On this view, ICT is an important part of the new economy, but must be seen in the context of a number of other social and economic factors. Research limitations/implications – The research is based on a qualitative assessment of existing frameworks and, using examples from commercial real estate, assesses the extent to which a “socio-technical” framework can aid understanding of ICT impact. Practical implications – The paper highlights a number of the main issues in conceptualising ICT impact in real estate and critically examines the emergence of a new economy in the information society within the general context of real estate. The paper also highlights research gaps in the field. Originality/value – The paper deconstructs the myths of the “death of real estate” and “productivity increase means job losses” in relation to office real estate. Finally, it examines some of the ways in which ICT is affecting real estate and suggests the most important components of a future research agenda in the field of ICT and real estate impact; it will be of value to property investors, facilities managers, developers, financiers, and others.
Abstract:
Forgetting immediate physical reality and having awareness of one's location in the simulated world is critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. Answers to the question "where am I?" operate at two levels: whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial co-ordinates of that locus. Both hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most of these studies treat the technology as a monolith (box-centered). Using as its theoretical basis the variable-centered approach put forth by Nass and Mason (1990), which breaks the technology down into its component variables and their corresponding values, this paper looks at the contributions of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism and level of detail) to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment. The study has completed its first wave of data collection; the next phase starts in January 2007 and is expected to be complete by February 2007. Theoretical and practical implications of the study are discussed.
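The fractional factorial device mentioned above can be made concrete. A minimal sketch, assuming two levels per variable and the textbook defining relation I = ABCDE (the abstract does not state which fraction or which levels were actually used):

```python
from itertools import product

# Five two-level factors from the study (coded -1/+1; the actual
# levels used in the experiment are not stated in the abstract).
factors = ["stereoscopy", "screen_size", "field_of_view",
           "realism", "level_of_detail"]

# 2**(5-1) half-fraction via the defining relation I = ABCDE:
# keep only the runs whose coded levels multiply to +1. This halves
# the 32 full-factorial conditions to 16 while keeping main effects
# and two-factor interactions estimable (resolution V).
design = [run for run in product((-1, 1), repeat=len(factors))
          if run[0] * run[1] * run[2] * run[3] * run[4] == 1]

print(len(design), "of", 2 ** len(factors), "conditions")
print(dict(zip(factors, design[0])))  # one example condition
```

Halving the condition count is exactly the resource saving the authors invoke: each extra two-level variable doubles a full factorial, whereas a suitable fraction keeps the effects of interest estimable at a fraction of the runs.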
Abstract:
Urbanization-related alterations to the surface energy balance impact urban warming (‘heat islands’), the growth of the boundary layer, and many other biophysical processes. Traditionally, in situ heat flux measurements have been used to quantify such processes, but these typically represent only a small local-scale area within the heterogeneous urban environment. For this reason, remote sensing approaches are very attractive for elucidating more spatially representative information. Here we use hyperspectral imagery from a new airborne sensor, the Operative Modular Imaging Spectrometer (OMIS), along with a survey map and meteorological data, to derive the land cover information and surface parameters required to map spatial variations in turbulent sensible heat flux (QH). The results from two spatially-explicit flux retrieval methods, which use contrasting approaches and, to a large degree, different input data, are compared for a central urban area of Shanghai, China: (1) the Local-scale Urban Meteorological Parameterization Scheme (LUMPS) and (2) an Aerodynamic Resistance Method (ARM). Sensible heat fluxes are determined at the full 6 m spatial resolution of the OMIS sensor, and at lower resolutions via pixel aggregation and spatial averaging. At the 6 m spatial resolution, the sensible heat flux of rooftop-dominated pixels exceeds that of roads, water and vegetated areas, with values peaking at ∼350 W m⁻², whilst the storage heat flux is greatest for road-dominated pixels (peaking at around 420 W m⁻²). We investigate the use of both OMIS-derived land surface temperatures made using a Temperature–Emissivity Separation (TES) approach, and land surface temperatures estimated from air temperature measurements. Sensible heat flux differences from the two approaches over the entire 2 × 2 km study area are less than 30 W m⁻², suggesting that methods employing either strategy may be practical when operated using low spatial resolution (e.g. 1 km) data. Due to the differing methodologies, direct comparisons between results obtained with the LUMPS and ARM methods are most sensibly made at reduced spatial scales. At 30 m spatial resolution, both approaches produce similar results, with the smallest difference being less than 15 W m⁻² in mean QH averaged over the entire study area. This is encouraging given the differing architecture and data requirements of the LUMPS and ARM methods. Furthermore, in terms of mean study-area QH, the results obtained by averaging the original 6 m spatial resolution LUMPS-derived QH values to 30 and 90 m spatial resolution are within ∼5 W m⁻² of those derived from averaging the original surface parameter maps prior to input into LUMPS, suggesting that the use of much lower spatial resolution spaceborne imagery, for example from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), is likely to be a practical solution for heat flux determination in urban areas.
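Aerodynamic resistance methods of this kind generally rest on the bulk transfer relation QH = ρ·cp·(Ts − Ta)/ra. The sketch below illustrates that relation per pixel; note it is not the paper's ARM, which derives ra per pixel from surface roughness and stability, whereas here a single assumed ra is supplied directly.

```python
import numpy as np

RHO = 1.2    # air density, kg m^-3 (typical near-surface value)
CP = 1004.0  # specific heat of air at constant pressure, J kg^-1 K^-1

def sensible_heat_flux(t_surface, t_air, r_a):
    """Bulk-transfer form underlying aerodynamic resistance methods:
    QH = rho * cp * (Ts - Ta) / ra, returned in W m^-2.
    In the paper ra varies per pixel with roughness and stability;
    here a fixed value is passed in for illustration."""
    return RHO * CP * (t_surface - t_air) / r_a

# Toy 3x3 "image" of land surface temperatures (K), e.g. from TES.
ts = np.array([[310.0, 305.0, 303.0],
               [312.0, 306.0, 302.0],
               [309.0, 304.0, 301.0]])
qh = sensible_heat_flux(ts, t_air=300.0, r_a=60.0)  # ra = 60 s m^-1, assumed
print(qh.round(1))  # hot rooftop-like pixels yield the largest QH
```

With the assumed values, a 10 K surface-air difference gives QH ≈ 200 W m⁻², the right order of magnitude relative to the ∼350 W m⁻² rooftop peak reported above.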
Abstract:
The international response to SARS-CoV has produced an outstanding number of protein structures in a very short time. This review summarizes the findings of functional and structural studies including those derived from cryoelectron microscopy, small angle X-ray scattering, NMR spectroscopy, and X-ray crystallography, and incorporates bioinformatics predictions where no structural data is available. Structures that shed light on the function and biological roles of the proteins in viral replication and pathogenesis are highlighted. The high percentage of novel protein folds identified among SARS-CoV proteins is discussed.
Abstract:
Whilst common-sense knowledge has been well researched in terms of intelligence and (in particular) artificial intelligence, specific, factual knowledge also plays a critical part in practice. When it comes to testing for intelligence, testing for factual knowledge is, in everyday life, frequently used as a front-line tool. This paper presents new results which were the outcome of a series of practical Turing tests held on 23rd June 2012 at Bletchley Park, England. The focus of this paper is on the employment of specific knowledge testing by interrogators. Of interest are prejudiced assumptions made by interrogators as to what they believe should be widely known, and subsequently the conclusions drawn if an entity does or does not appear to know a particular fact known to the interrogator. The paper is not at all about the performance of machines or hidden humans, but rather about the strategies, based on assumptions, of Turing test interrogators. Full, unedited transcripts from the tests are shown for the reader as working examples. As a result, it might be possible to draw critical conclusions with regard to the nature of human concepts of intelligence, in terms of the role played by specific, factual knowledge in our understanding of intelligence, whether this is exhibited by a human or a machine. This is specifically intended as a position paper: firstly, it claims that practicalising Turing's test is a useful exercise throwing light on how we humans think; secondly, it takes a potentially controversial stance, because some interrogators adopt a solipsistic questioning style towards hidden entities, judging an entity to be a thinking, intelligent human only if it thinks like them and knows what they know. The paper is aimed at opening discussion with regard to the different aspects considered.
Abstract:
Background. Current models of concomitant, intermittent strabismus, heterophoria, convergence and accommodation anomalies are either theoretically complex or incomplete. We propose an alternative and more practical way to conceptualize clinical patterns. Methods. In each of three hypothetical scenarios (normal; high AC/A and low CA/C ratios; low AC/A and high CA/C ratios) there can be a disparity-biased or blur-biased “style”, despite identical ratios. We calculated a disparity bias index (DBI) to reflect these biases. We suggest how clinical patterns fit these scenarios and provide early objective data from small illustrative clinical groups. Results. Normal adults and children showed disparity bias (adult DBI 0.43, 95% CI 0.50-0.36; child DBI 0.20, 95% CI 0.31-0.07; p=0.001). Accommodative esotropes showed less disparity bias (DBI 0.03). In the high AC/A and low CA/C scenario, early presbyopes had a mean DBI of 0.17 (95% CI 0.28-0.06), compared with a DBI of -0.31 in convergence excess esotropes. In the low AC/A and high CA/C scenario, near exotropes had a mean DBI of 0.27, while we predict that non-strabismic, non-amblyopic hyperopes with good vision without spectacles will show lower DBIs. Disparity bias ranged between 1.25 and -1.67. Conclusions. Establishing disparity or blur bias, together with knowing whether convergence to target demand exceeds accommodation or vice versa, explains clinical patterns more effectively than AC/A and CA/C ratios alone. Excessive bias, or inflexibility in near-cue use, increases the risk of clinical problems. We suggest clinicians look carefully at the details of accommodation and convergence changes induced by lenses, dissociation and prisms, and use these to plan treatment in relation to the model.
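The abstract quantifies bias with a DBI but does not give its formula, so no attempt is made to reconstruct it here. For orientation, the sketch below shows the standard gradient-method computation of the AC/A ratio on which such models build; the lens power and deviation values are invented examples, not data from the study.

```python
def gradient_ac_a(deviation_without, deviation_with, lens_power):
    """Gradient-method AC/A ratio: change in convergence (prism
    dioptres) per dioptre of lens-induced accommodative change.
    Sign convention: esodeviation positive, exodeviation negative.
    The CA/C ratio is measured analogously, as the accommodative
    change driven by a disparity-only (blur-free) convergence change."""
    return (deviation_without - deviation_with) / lens_power

# Example: +3.00 D lenses at near reduce an esodeviation from 20 to 5
# prism dioptres, giving a high AC/A of 5 pd/D.
print(gradient_ac_a(20, 5, 3.0))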
Abstract:
Green supply chain management and environmental and ethical behaviour (EEB), a major component of corporate responsibility (CR), are rapidly developing fields in research and practice. The influence and effect of EEB at the functional level, however, is under-researched. Similarly, the management of risk in the supply chain has become a practical concern for many firms. It is important that managers have a good understanding of the risks associated with supplier partnerships. This paper examines the effect of firms’ investment in EEB as part of corporate social responsibility in mediating the relationship between supply chain partnership (SCP) and management appreciation of the risk of partnering. We hypothesise that simply entering into an SCP does not facilitate an appreciation of the risk of partnering and may even hamper such awareness. However, such an appreciation of the risk is facilitated through CR’s environmental and stakeholder management ethos. The study contributes further by separating risk into distinct relational and performance components. The results of a firm-level survey confirm the mediation effect, highlighting the value to supply chain strategy and design of investing in EEB on three fronts: building internal awareness, monitoring and sharing best practice.
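Mediation effects of this kind are commonly tested with a product-of-coefficients (Sobel) approach. The sketch below is a generic illustration, not the authors' survey model: the variable roles (SCP → EEB → risk appreciation) follow the abstract, but the data, coefficients and test choice are assumptions.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares: coefficients and their standard errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

def sobel_test(x, m, y):
    """Indirect effect a*b of x on y through mediator m, with Sobel z."""
    Xa = np.column_stack([np.ones_like(x), x])
    beta_a, se_a_vec = ols(Xa, m)            # path a: m ~ x
    Xb = np.column_stack([np.ones_like(x), x, m])
    beta_b, se_b_vec = ols(Xb, y)            # path b: y ~ x + m
    a, se_a = beta_a[1], se_a_vec[1]
    b, se_b = beta_b[2], se_b_vec[2]
    ab = a * b
    z = ab / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    return ab, z

# Simulated firm-level scores (roles follow the abstract; data invented).
rng = np.random.default_rng(0)
n = 200
scp = rng.normal(size=n)                           # supply chain partnership
eeb = 0.5 * scp + rng.normal(size=n)               # EEB investment (mediator)
risk = 0.4 * eeb + 0.1 * scp + rng.normal(size=n)  # risk appreciation
print(sobel_test(scp, eeb, risk))  # nonzero a*b with large |z| => mediation
```

A significant indirect effect a*b alongside a small direct path is the signature of the mediation pattern the abstract describes: partnership alone does little for risk awareness unless it flows through EEB investment.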
Abstract:
Summary. Reasons for performing the study: Metabonomics is emerging as a powerful tool for disease screening and for investigating mammalian metabolism. This study aims to create a metabolic framework by producing a preliminary reference guide for the normal equine metabolic milieu. Objectives: To metabolically profile plasma, urine and faecal water from healthy racehorses using high-resolution 1H-NMR spectroscopy, and to provide a list of the dominant metabolites present in each biofluid for the benefit of future research in this area. Study design: This study was performed using seven Thoroughbreds in race training at a single time-point. Urine and faecal samples were collected non-invasively, and plasma was obtained from samples taken for routine clinical chemistry purposes. Methods: Biofluids were analysed using 1H-NMR spectroscopy. Metabolite assignment was achieved via a range of 1D and 2D experiments. Results: A total of 102 metabolites were assigned across the three biological matrices. A core metabonome of 14 metabolites was ubiquitous across all biofluids. Each biological matrix provided a unique window onto different aspects of systemic metabolism. Urine was the most populated metabolite matrix, with 65 identified metabolites, 39 of which were unique to this biological compartment; a number of these were related to gut microbial-host co-metabolism. Faecal samples were the most metabolically variable between animals; acetate was responsible for the majority (28%) of this variation. Short-chain fatty acids were the predominant features identified within this biofluid by 1H-NMR spectroscopy. Conclusions: Metabonomics provides a platform for investigating complex and dynamic interactions between the host and its consortium of gut microbes, and has the potential to uncover markers of health and disease in a variety of biofluids. The inherent variation in faecal extracts, together with the relative abundance of microbial-mammalian co-metabolites in urine and the invasive nature of plasma sampling, implies that urine is the most appropriate biofluid for metabonomic analysis.
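Between-animal variation in binned NMR spectra of this kind is typically summarised with principal component analysis, with the loadings pointing at the metabolites (here acetate) that drive a given share of the variance. A minimal sketch with simulated spectra; the bin positions, dimensions and pipeline are illustrative assumptions, not the authors' workflow.

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated matrix of binned 1H-NMR faecal spectra:
# rows = 7 horses, columns = chemical-shift bins (all values invented).
rng = np.random.default_rng(1)
spectra = rng.lognormal(sigma=0.3, size=(7, 250))
acetate_bin = 190                      # hypothetical bin near ~1.92 ppm
spectra[:, acetate_bin] *= rng.uniform(0.5, 2.0, size=7)  # dominant variation

# Mean-centre (standard pre-processing before PCA of spectra), then fit.
X = spectra - spectra.mean(axis=0)
pca = PCA(n_components=3).fit(X)
print(pca.explained_variance_ratio_)          # variance share per component
print(np.argmax(np.abs(pca.components_[0])))  # bin driving PC1 (acetate here)
```

Reading the largest PC1 loading back to its chemical-shift bin is how a single metabolite such as acetate gets credited with a stated fraction of the inter-animal variation.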
Abstract:
Incorporating an emerging therapy as a new randomisation arm in a clinical trial that is open to recruitment would be desirable to researchers, regulators and patients, ensuring that the trial remains current, that new treatments are evaluated as quickly as possible, and that the time and cost of determining optimal therapies are minimised. It may take many years to run a clinical trial from concept to reporting within a rapidly changing drug development environment; hence, in order for trials to be most useful to inform policy and practice, it is advantageous for them to be able to adapt to emerging therapeutic developments. This paper reports a comprehensive literature review on methodologies for, and practical examples of, amending an ongoing clinical trial by adding a new treatment arm. Relevant methodological literature describing the statistical considerations required when making this specific type of amendment is identified, and the key statistical concepts when planning the addition of a new treatment arm are extracted, assessed and summarised. For completeness, this includes an assessment of the statistical recommendations within general adaptive design guidance documents. Examples of confirmatory ongoing trials designed within the frequentist framework that have added an arm in practice are reported, and the details of each amendment are reviewed. An assessment is made as to how well the relevant statistical considerations were addressed in practice, and the related implications. The literature review confirmed that there is currently no clear methodological guidance on this topic, but that such guidance would be advantageous in helping this efficient design amendment to be used more frequently and appropriately in practice. Eight confirmatory trials were identified as having added a treatment arm, suggesting that trials can benefit from this amendment and that it can be practically feasible; however, the trials were not always able to address the key statistical considerations, often leading to uninterpretable or invalid outcomes. If the statistical concepts identified within this review are considered and addressed during the design of a trial amendment, it is possible to effectively assess a new treatment arm within an ongoing trial without compromising the original trial outcomes.
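One recurring statistical consideration when adding an arm to an ongoing trial is multiplicity: each extra comparison against a shared control inflates the family-wise error rate unless the significance level is adjusted. A minimal simulation sketch under the global null (sample sizes, test choice and adjustment are assumed for illustration; the review itself does not prescribe this design):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

def fwer(n_arms, alpha=0.05, n_sim=20000, n=100):
    """Simulated family-wise error rate under the global null when
    each of n_arms experimental arms is compared with one shared
    control using two-sided z-tests at per-comparison level alpha."""
    crit = norm.ppf(1 - alpha / 2)
    hits = 0
    for _ in range(n_sim):
        # Sample means of n unit-variance observations per arm.
        means = rng.normal(scale=1 / np.sqrt(n), size=n_arms + 1)
        z = (means[1:] - means[0]) / np.sqrt(2 / n)  # each arm vs control
        hits += np.any(np.abs(z) > crit)
    return hits / n_sim

# Adding a second comparison inflates the error rate above 5%;
# splitting alpha (e.g. Bonferroni, 0.025 per comparison) restores it.
print(fwer(1), fwer(2), fwer(2, alpha=0.025))
```

Sketches like this make concrete why trials that added an arm without revisiting the error-rate calculations risked the uninterpretable or invalid outcomes the review reports.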