969 results for Steven Moll
Abstract:
This paper describes experiments relating to the perception of the roughness of simulated surfaces via the haptic and visual senses. Subjects used a magnitude estimation technique to judge the roughness of “virtual gratings” presented via a PHANToM haptic interface device, and a standard visual display unit. It was shown that under haptic perception, subjects tended to perceive roughness as decreasing with increased grating period, though this relationship was not always statistically significant. Under visual exploration, the exact relationship between spatial period and perceived roughness was less well defined, though linear regressions provided a reliable approximation to individual subjects’ estimates.
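A minimal sketch of the kind of per-subject analysis the abstract describes: a linear regression of magnitude estimates of roughness against grating spatial period. The spatial periods and roughness values below are invented for illustration, and SciPy's `linregress` stands in for whatever regression procedure the authors actually used.

```python
# Hypothetical sketch: fitting a linear regression to one subject's
# magnitude estimates of roughness as a function of grating spatial period.
# The data values below are invented for illustration only.
import numpy as np
from scipy import stats

spatial_period_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])   # grating periods (assumed)
roughness_estimate = np.array([82, 70, 61, 55, 43, 38])         # magnitude estimates (assumed)

fit = stats.linregress(spatial_period_mm, roughness_estimate)
print(f"slope = {fit.slope:.1f} (negative => perceived roughness falls with period)")
print(f"r^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.3f}")
```

A significant negative slope would correspond to the haptic result reported above; a poor fit or non-significant slope would correspond to the less well defined visual case.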
Abstract:
In this paper we set out what we consider to be a set of best practices for statisticians in the reporting of pharmaceutical industry-sponsored clinical trials. We make eight recommendations covering: author responsibilities and recognition; publication timing; conflicts of interest; freedom to act; full author access to data; trial registration and independent review. These recommendations are made in the context of the prominent role played by statisticians in the design, conduct, analysis and reporting of pharmaceutical sponsored trials and the perception of the reporting of these trials in the wider community.
Abstract:
Concerns about potentially misleading reporting of pharmaceutical industry research have surfaced many times. The potential for duality (and thereby conflict) of interest is only too clear when you consider the sums of money required for the discovery, development and commercialization of new medicines. As the ability of major, mid-size and small pharmaceutical companies to innovate has waned, as evidenced by the seemingly relentless year-on-year decline in the numbers of new medicines approved by the Food and Drug Administration and the European Medicines Agency, not only has the cost per new approved medicine risen: so too has the public and media concern about the extent to which the pharmaceutical industry is open and honest about the efficacy, safety and quality of the drugs we manufacture and sell. In 2005 an Editorial in the Journal of the American Medical Association made clear that, so great was their concern about misleading reporting of industry-sponsored studies, henceforth no article would be published that was not also guaranteed by independent statistical analysis. We examine the precursors to this Editorial, as well as its immediate and lasting effects for statisticians, for the manner in which statistical analysis is carried out, and for the industry more generally.
Abstract:
Texture and small-scale surface details are widely recognised as playing an important role in the haptic identification of objects. In order to simulate realistic textures in haptic virtual environments, it has become increasingly necessary to identify a robust technique for modelling surface profiles. This paper describes a method whereby Fourier series spectral analysis is employed to describe the measured surface profiles of several characteristic surfaces. The results presented suggest that a bandlimited Fourier series can be used to provide a realistic approximation to surface amplitude profiles.
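A short sketch of a bandlimited Fourier series approximation of the kind described, using a synthetic profile in place of the measured surfaces; the scan length, noise level and number of retained harmonics are all assumptions made purely for illustration.

```python
# Hypothetical sketch: approximating a surface height profile with a
# bandlimited Fourier series, i.e. keeping only the lowest-frequency harmonics.
# The synthetic "measured" profile below stands in for real profilometer data.
import numpy as np

N = 1024
x = np.linspace(0.0, 10.0, N, endpoint=False)                        # scan length in mm (assumed)
profile = 0.02 * np.sin(2 * np.pi * x) + 0.005 * np.random.randn(N)  # height in mm (assumed)

K = 16                                   # number of harmonics retained (assumed)
spectrum = np.fft.rfft(profile)
spectrum[K + 1:] = 0.0                   # discard everything above the Kth harmonic
approx = np.fft.irfft(spectrum, n=N)     # bandlimited reconstruction

rms_error = np.sqrt(np.mean((profile - approx) ** 2))
print(f"RMS error of {K}-harmonic approximation: {rms_error:.4f} mm")
```

The RMS error gives one simple measure of how realistic the bandlimited approximation is for a given number of harmonics.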
Abstract:
Purpose – The purpose of this paper is to consider prospects for UK REITs, which were introduced on 1 January 2007. It specifically focuses on the potential influence of depreciation and expenditure on income and distributions. Design/methodology/approach – First, the ways in which depreciation can affect vehicle earnings and value are discussed. This is then set in the context of the specific rules and features of REITs. An analysis using property income and expenditure data from the Investment Property Databank (IPD) then assesses what gross and net income for a UK REIT might have been like for the period 1984-2003. Findings – A UK REIT must distribute at least 90 per cent of net income from its property rental business. Expenditure therefore plays a significant part in determining what funds remain for distribution. Over 1984-2003, expenditure absorbed 20 per cent of gross income and was a source of earnings volatility, which would have been exacerbated by gearing. Practical implications – Expenditure must take place to help UK REITs maintain and renew their real estate portfolios. In view of this, investors should moderate expectations of a high and stable income return, although it may well still be high and stable relative to alternative investments. Originality/value – Previous literature on depreciation has not quantified the amounts spent on portfolios to keep depreciation at those rates. Nor, to our knowledge, have its ideas been placed in the indirect investor context.
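The distribution arithmetic can be illustrated with a minimal sketch: the 90 per cent distribution requirement and the 20 per cent expenditure share come from the abstract, while the gross income figure is invented.

```python
# Minimal sketch of the distribution arithmetic described above, using
# illustrative figures: the 20 per cent expenditure share and the 90 per cent
# distribution requirement come from the abstract; the gross income is invented.
gross_income = 100.0                      # gross rental income, in £m (illustrative)
expenditure_share = 0.20                  # share of gross income absorbed by expenditure
net_income = gross_income * (1.0 - expenditure_share)
minimum_distribution = 0.90 * net_income  # UK REIT must distribute >= 90% of net rental income

print(f"Net income: {net_income:.1f}, minimum distribution: {minimum_distribution:.1f}")
```

Because the distribution floor applies to net rather than gross income, any volatility in expenditure feeds directly through to the funds available for distribution.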
Abstract:
Investment risk models with infinite variance provide a better description of distributions of individual property returns in the IPD UK database over the period 1981 to 2003 than normally distributed risk models. This finding mirrors results in the US and Australia using identical methodology. Real estate investment risk is heteroskedastic, but the characteristic exponent of the investment risk function is constant across time – yet it may vary by property type. Asset diversification is far less effective at reducing the impact of non‐systematic investment risk on real estate portfolios than in the case of assets with normally distributed investment risk. The results, therefore, indicate that multi‐risk factor portfolio allocation models based on measures of investment codependence from finite‐variance statistics are ineffective in the real estate context.
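A hedged sketch of the diversification point, not the paper's own calculation: for i.i.d. alpha-stable returns with characteristic exponent alpha, the scale of an equally weighted portfolio of n assets falls as n^(1/alpha - 1), so with alpha below 2 diversification is much weaker than the Gaussian n^(-1/2) benchmark. The alpha = 1.5 value below is illustrative only, not a figure from the paper.

```python
# Hedged sketch (not the paper's code): how the scale of an equally weighted
# portfolio of n i.i.d. assets shrinks under an alpha-stable risk model versus
# a normal one. For stable returns with characteristic exponent alpha, the
# portfolio scale falls as n**(1/alpha - 1); the normal case is alpha = 2.
def scale_reduction(n_assets: int, alpha: float) -> float:
    """Portfolio scale relative to a single asset."""
    return n_assets ** (1.0 / alpha - 1.0)

for n in (10, 50, 200):
    normal = scale_reduction(n, 2.0)      # Gaussian benchmark
    heavy = scale_reduction(n, 1.5)       # illustrative heavy-tailed exponent
    print(f"n={n:4d}  normal: {normal:.3f}  alpha=1.5: {heavy:.3f}")
```

The slower decay for alpha < 2 is the sense in which diversification is "far less effective" at reducing non-systematic risk under infinite-variance models.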
Abstract:
Purpose – The paper addresses the practical problems which emerge when attempting to apply longitudinal approaches to the assessment of property depreciation using valuation-based data. These problems relate to inconsistent valuation regimes and the difficulties in finding appropriate benchmarks. Design/methodology/approach – The paper adopts a case study of seven major office locations around Europe and attempts to determine ten-year rental value depreciation rates based on a longitudinal approach using IPD, CBRE and BNP Paribas datasets. Findings – The rates range from depreciation of 5 per cent per annum in Frankfurt to a 2 per cent appreciation rate in Stockholm. The results are discussed in the context of the difficulties in applying this method with inconsistent data. Research limitations/implications – The paper has methodological implications for measuring property investment depreciation and provides an example of the problems in adopting theoretically sound approaches with inconsistent information. Practical implications – Valuations play an important role in performance measurement and cross border investment decision making and, therefore, knowledge of inconsistency of valuation practice aids decision making and informs any application of valuation-based data in the attainment of depreciation rates. Originality/value – The paper provides new insights into the use of property market valuation data in a cross-border context, insights that previously had been anecdotal and unproven in nature.
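One common way to express a longitudinal rental depreciation rate is as the annualised shortfall of an ageing sample's rental growth relative to a benchmark. The sketch below uses that formulation with invented figures; it is not a reconstruction of the paper's IPD, CBRE or BNP Paribas calculations.

```python
# Hedged sketch of a longitudinal rental depreciation calculation: the annual
# rate at which an ageing sample's rental value falls behind a benchmark over
# T years. All figures below are invented for illustration.
def annual_depreciation(sample_start, sample_end, bench_start, bench_end, years):
    relative = (sample_end / sample_start) / (bench_end / bench_start)
    return 1.0 - relative ** (1.0 / years)   # positive value = depreciation

# Example: sample rents grew 10% while the benchmark grew 40% over 10 years.
rate = annual_depreciation(100.0, 110.0, 100.0, 140.0, 10)
print(f"Rental depreciation: {rate * 100:.1f}% per annum")
```

The practical problems the paper describes arise because the sample and benchmark series must be valued on a consistent basis for a calculation like this to be meaningful.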
Abstract:
The final warming of the stratospheric polar vortex at the end of northern hemisphere winter is examined in ECMWF ERA-Interim reanalysis data and an ensemble of chemistry climate models, using 20 years of data from each. In some years the final warming is found to occur first in the mid-stratosphere, and in others to occur first in the upper stratosphere. The strength of the winter stratospheric polar vortex, refraction of planetary waves, and the altitudes at which the planetary waves break in the northern extratropics lead to this difference in the vertical profile of the final warming. Years in which the final warming occurs first in the mid-stratosphere show, on average, a more negative NAO pattern in April mean sea level pressure than years in which the warming occurs first in the upper stratosphere. Thus, in the northern hemisphere, additional predictive skill of tropospheric climate in April can be gained from a knowledge of the vertical profile of the stratospheric final warming.
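A sketch of one common diagnostic for the timing of the final warming at a given level, not necessarily the one used in the paper: the day on which the zonal-mean zonal wind at 60°N first turns easterly without a sustained recovery. Comparing the dates obtained at an upper- and a mid-stratospheric level then classifies each year; the wind series and pressure levels below are synthetic and purely illustrative.

```python
# Hedged sketch (one common diagnostic, not necessarily the paper's): the final
# warming date at a given pressure level is taken as the day the zonal-mean
# zonal wind at 60N first turns easterly without recovering.
import numpy as np

def final_warming_day(u60n_daily: np.ndarray) -> int:
    """Index of the first day the wind turns easterly with no sustained recovery."""
    below = np.where(u60n_daily < 0.0)[0]
    for day in below:
        if u60n_daily[day:].mean() < 0.0:   # crude "no sustained recovery" test
            return int(day)
    return len(u60n_daily) - 1

# Synthetic example: the upper level reverses earlier than the mid level in this fake year.
days = np.arange(120)                       # roughly January-April
u_upper = 30.0 - 0.4 * days                 # e.g. an upper-stratospheric level (illustrative)
u_mid = 25.0 - 0.25 * days                  # e.g. a mid-stratospheric level (illustrative)
first = "upper" if final_warming_day(u_upper) < final_warming_day(u_mid) else "mid"
print(f"Final warming occurred first in the {first} stratosphere")
```

Classifying reanalysis years in this way is the kind of step that would precede compositing the April mean sea level pressure fields described above.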
Abstract:
Background: Efficient gene expression involves a trade-off between (i) premature termination of protein synthesis; and (ii) readthrough, where the ribosome fails to dissociate at the terminal stop. Sense codons that are similar in sequence to stop codons are more susceptible to nonsense mutation, and are also likely to be more susceptible to transcriptional or translational errors causing premature termination. We therefore expect this trade-off to be influenced by the number of stop codons in the genetic code. Although genetic codes are highly constrained, stop codon number appears to be their most volatile feature.

Results: In the human genome, codons readily mutable to stops are underrepresented in coding sequences. We construct a simple mathematical model based on the relative likelihoods of premature termination and readthrough. When readthrough occurs, the resultant protein has a tail of amino acid residues incorrectly added to the C-terminus. Our results depend strongly on the number of stop codons in the genetic code. When the code has more stop codons, premature termination is relatively more likely, particularly for longer genes. When the code has fewer stop codons, the length of the tail added by readthrough will, on average, be longer, and thus more deleterious. Comparative analysis of taxa with a range of stop codon numbers suggests that genomes whose code includes more stop codons have shorter coding sequences.

Conclusions: We suggest that the differing trade-offs presented by alternative genetic codes may result in differences in genome structure. More speculatively, multiple stop codons may mitigate readthrough, counteracting the disadvantage of a higher rate of nonsense mutation. This could help explain the puzzling overrepresentation of stop codons in the canonical genetic code and most variants.
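A toy numerical version of the trade-off sketched in the Results, not the paper's actual model: the per-codon chance of erroneous premature termination is assumed to scale with the number of stop codons in the code, and the readthrough tail is assumed to end at the first chance stop codon in random downstream sequence. All rates and the gene length are invented for illustration.

```python
# Toy sketch of the trade-off described above (not the paper's actual model).
# Assumptions: the per-codon chance of erroneous premature termination scales
# with the number of stop codons in the code, and the readthrough tail beyond
# the true stop ends at the first chance stop codon in random 3' sequence.
def premature_termination_prob(gene_length_codons, n_stops, base_error=1e-5):
    """P(at least one premature stop over the gene); per-codon error rate ~ n_stops."""
    per_codon = base_error * n_stops
    return 1.0 - (1.0 - per_codon) ** gene_length_codons

def expected_tail_length(n_stops, n_codons=64):
    """Mean readthrough tail length: geometric with success probability n_stops/64."""
    return n_codons / n_stops

for n_stops in (1, 2, 3, 4):
    p_term = premature_termination_prob(500, n_stops)   # 500-codon gene, illustrative
    tail = expected_tail_length(n_stops)
    print(f"{n_stops} stop codon(s): P(premature stop) ~ {p_term:.3f}, "
          f"mean readthrough tail ~ {tail:.0f} codons")
```

The two functions pull in opposite directions as the number of stop codons grows, which is the tension between premature termination and readthrough that the model formalises.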