32 results for cut-off value
Abstract:
Carsberg (2002) suggested that the periodic valuation accuracy studies undertaken by, amongst others, IPD/Drivers Jonas (2003) should be undertaken every year and be sponsored by the RICS, which acts as the self-regulating body for valuations in the UK. This paper does not address the wider issues concerning the nature of properties which are sold and whether the sale prices are influenced by prior valuations, but considers solely the technical issues concerning the timing of the valuation and sales data. This study uses valuations and sales data from the Investment Property Databank UK Monthly Index to attempt to identify the date on which sale data is divulged to valuers. This information will inform accuracy studies that use the closeness of valuations to the sale completion date as a cut-off criterion for excluding data from the analysis. It will also, assuming valuers are informed quickly of any agreed sales, help to determine the actual sale agreed date rather than the completion date, which includes a period of due diligence between when the sale is agreed and its completion. Valuations should be updated to this date, rather than the formal completion date, if a reliable measure of valuation accuracy is to be determined. An accuracy study is then undertaken using a variety of updating periods and the differences between the results are examined. The paper concludes that the sale only becomes known to valuers in the month prior to the sale taking place, which suggests either that sales due diligence periods are shortening or that valuers are not told quickly of agreed sale prices. Studies that adopt a four-month cut-off date for valuations compared to sales completion dates are therefore over-cautious; this could be reduced to two months without compromising the data.
Abstract:
In this paper we derive novel approximations to trapped waves in a two-dimensional acoustic waveguide whose walls vary slowly along the guide, and at which either Dirichlet (sound-soft) or Neumann (sound-hard) conditions are imposed. The guide contains a single smoothly bulging region of arbitrary amplitude, but is otherwise straight, and the modes are trapped within this localised increase in width. Using a similar approach to that in Rienstra (2003), a WKBJ-type expansion yields an approximate expression for the modes which can be present, which display either propagating or evanescent behaviour; matched asymptotic expansions are then used to derive connection formulae which bridge the gap across the cut-off between propagating and evanescent solutions in a tapering waveguide. A uniform expansion is then determined, and it is shown that appropriate zeros of this expansion correspond to trapped mode wavenumbers; the trapped modes themselves are then approximated by the uniform expansion. Numerical results determined via a standard iterative method are then compared to results of the full linear problem calculated using a spectral method, and the two are shown to be in excellent agreement, even when $\epsilon$, the parameter characterising the slow variations of the guide’s walls, is relatively large.
Abstract:
Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpretation of bacterial transcriptomics data and of CGH microarray data for examining genetic stability in oncogenes, there are none specifically designed to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in the accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described. We have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, and shown that a simple dynamic approach using a kernel density estimator performed better than both the established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
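To make the kernel-density cut-off idea concrete: one simple way to place a presence/absence threshold is to locate the valley (antimode) between the two main modes of the log-ratio distribution. The sketch below is an illustration under assumed two-component data, not the authors' actual pipeline; all names and values are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_cutoff(log_ratios, grid_size=512):
    """Place a presence/absence cut-off at the density minimum
    (antimode) between the two largest modes of the log-ratios."""
    kde = gaussian_kde(log_ratios)
    grid = np.linspace(log_ratios.min(), log_ratios.max(), grid_size)
    density = kde(grid)
    # indices of local maxima (candidate modes)
    peaks = [i for i in range(1, grid_size - 1)
             if density[i] > density[i - 1] and density[i] > density[i + 1]]
    if len(peaks) < 2:
        return float(np.median(log_ratios))  # no clear bimodality
    # the two highest-density peaks, in index order
    lo, hi = sorted(sorted(peaks, key=lambda i: density[i])[-2:])
    valley = lo + int(np.argmin(density[lo:hi + 1]))
    return float(grid[valley])

# Illustrative data: 'present' genes near log-ratio 0, 'absent/divergent' near -2
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.3, 800), rng.normal(-2.0, 0.4, 200)])
cut = kde_cutoff(data)
```

Because the density estimate adapts to each array's distribution, the cut-off moves with the data rather than being fixed in advance, which is the "dynamic" property the abstract refers to.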
Abstract:
Tropical-extratropical cloud band systems over southern Africa, known as tropical temperate troughs (TTTs), are known to contribute substantially to South African summer rainfall. This study performs a comprehensive assessment of the seasonal cycle and rainfall contribution of TTTs by using a novel object-based strategy that explicitly tracks these systems over their full life cycle. The methodology incorporates a simple assignment of station rainfall data to each event, thereby creating a database containing detailed rainfall characteristics for each TTT. This is used to explore the importance of TTTs for rain days and climatological rainfall totals in October–March. Average contributions range from 30 to 60%, with substantial spatial heterogeneity observed. TTT rainfall contributions over the Highveld and eastern escarpment are lower than expected. A short analysis of TTT rainfall variability indicates that TTTs provide substantial, but not dominant, intraseasonal and interannual variability in station rainfall totals. TTTs are, however, responsible for a high proportion of heavy rainfall days. Of 52 extreme rainfall events in the 1979–1999 period, 30 are associated with these tropical-extratropical interactions. Cut-off lows were involved in the evolution of 6 of these TTTs. The study concludes with an analysis of the question: does the Madden-Julian Oscillation influence the intensity of TTT rainfall over South Africa? Results suggest a weak but significant suppression (enhancement) of intensity during phase 1 (phase 6).
Abstract:
A prediction mechanism is necessary in human visual motion control to compensate for the delay of the sensory-motor system. In a previous study, "proactive control" was discussed as one example of the predictive function of human beings, in which the motion of the hands preceded the virtual moving target in visual tracking experiments. To study the roles of the positional-error correction mechanism and the prediction mechanism, we carried out an intermittently-visual tracking experiment in which a circular orbit was segmented into target-visible regions and target-invisible regions. The main results of this research were as follows. A rhythmic component appeared in the tracer velocity when the target velocity was relatively high. The period of the rhythm in the brain obtained from environmental stimuli shortened by more than 10%. This shortening of the period accelerates the hand motion as soon as the visual information is cut off, and causes the hand motion to precede the target motion. Although the precedence of the hand in the blind region is reset by the environmental information when the target enters the visible region, the hand motion precedes the target on average when the predictive mechanism dominates the error-corrective mechanism.
Abstract:
A typical feature of the atmospheric circulation at middle and high latitudes is a tendency to fluctuate between two rather extreme circulation patterns. This behaviour of the atmosphere is most common in the Northern Hemisphere during winter and has been known to meteorologists for a considerable time (e.g. Garriott (1904)). One of these two states is identified by a predominantly zonal, or so-called high-index, circulation; the other by a meridional, or low-index, circulation. The meridional circulation is often broken up into a characteristic atmospheric pattern of cut-off lows and highs. These features usually have a time scale of several days, during which they affect the weather in a very dominating way. The transition from the zonal to the meridional or cellular circulation is very characteristic and follows a very typical chain of events.
Abstract:
The frequencies of atmospheric blocking in both winter and summer, and the changes in them from the 20th to the 21st century as simulated in twelve CMIP5 models, are analysed. The RCP 8.5 high-emission scenario runs are used to represent the 21st century. The analysis is based on the wave-breaking methodology of Pelly and Hoskins (2003a). It differs from the Tibaldi and Molteni (1990) index in viewing equatorward cut-off lows and poleward blocking highs in an equal manner as indicating a disruption to the westerlies. One-dimensional and two-dimensional diagnostics are applied to identify blocking of the mid-latitude storm track and also at higher latitudes. Winter blocking frequency is found to be generally underestimated. The models give a decrease in the European blocking maximum in the 21st century, consistent with the results of other studies. There is a mean 21st-century winter poleward shift of high-latitude blocking, but little agreement between the models on the details. In summer, Eurasian blocking is also underestimated in the models, whereas it is too large over the high-latitude ocean basins. A decrease in European blocking frequency in the 21st-century model runs is again found. However, in summer there is a clear eastward shift of blocking over Eastern Europe and Western Russia, in a region close to the blocking that dominated the Russian summer of 2010. While summer blocking decreases in general, the poleward shift of the storm track into the region of frequent high-latitude blocking may mean that the incidence of storms being obstructed by blocks may actually increase.
Abstract:
The functional food market is growing rapidly, and membrane processing offers several advantages over conventional methods for the separation, fractionation and recovery of bioactive components. The aim of the present study was to select a process that could be implemented easily on an industrial scale for the isolation of natural lactose-derived oligosaccharides (OS) from caprine whey, enabling the development of functional foods for clinical and infant nutrition. The most efficient process was the combination of a pre-treatment to eliminate proteins and fat, using an ultrafiltration (UF) membrane of 25 kDa molecular weight cut-off (MWCO), followed by a tighter UF membrane with 1 kDa MWCO. Circa 90% of the carbohydrates recovered in the final retentate were OS. Capillary electrophoresis was used to evaluate the OS profile in this retentate. The combined membrane-processing system is thus a promising technique for obtaining natural concentrated OS from whey.
Abstract:
This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to optimally decompose the signals in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
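The notion of a cost tied to cut-off sharpness can be sketched on a simple FIR example: a cost that penalises passband deviation from unity and residual stopband energy falls as the transition band narrows. The cost function, edge frequencies, and filters below are illustrative assumptions, not the paper's actual objective or filter bank.

```python
import numpy as np
from scipy import signal

def cutoff_sharpness_cost(taps, pass_edge=0.4, stop_edge=0.6, n=1024):
    """Illustrative cost: deviation from an ideal half-band response.
    A sharper cut-off between pass_edge and stop_edge lowers the cost."""
    w, h = signal.freqz(taps, worN=n)
    f = w / np.pi                 # normalised frequency, 0..1 (Nyquist = 1)
    mag = np.abs(h)
    passband_err = np.mean((mag[f <= pass_edge] - 1.0) ** 2)
    stopband_err = np.mean(mag[f >= stop_edge] ** 2)
    return passband_err + stopband_err

# Two lowpass filters at the same nominal cut-off (0.5 of Nyquist):
short_taps = signal.firwin(9, 0.5)   # few taps: gentle roll-off
long_taps = signal.firwin(65, 0.5)   # many taps: sharper cut-off
```

A numerical optimiser minimising such a cost over the filter parameters would, as the abstract describes, trade frequency selectivity against time resolution (here, filter length) at each decomposition level.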
Abstract:
Objective: To test whether gut permeability is increased in autism spectrum disorders (ASD) by evaluating gut permeability in a population-derived cohort of children with ASD compared with age- and intelligence-quotient-matched controls without ASD but with special educational needs (SEN). Patients and Methods: One hundred and thirty-three children aged 10–14 years, 103 with ASD and 30 with SEN, were given an oral test dose of mannitol and lactulose, and urine was collected for 6 hr. Gut permeability was assessed by measuring the urine lactulose/mannitol (L/M) recovery ratio by electrospray mass spectrometry-mass spectrometry. The ASD group was subcategorized for comparison into those without (n = 83) and with (n = 20) regression. Results: There was no significant difference in L/M recovery ratio (mean (95% confidence interval)) between the groups with ASD: 0.015 (0.013–0.018), and SEN: 0.014 (0.009–0.019), nor in lactulose, mannitol, or creatinine recovery. No significant differences were observed in any parameter for the regressed versus non-regressed ASD groups. Results were consistent with previously published normal ranges. Eleven children (9/103 = 8.7% ASD and 2/30 = 6.7% SEN) had an L/M recovery ratio > 0.03 (the accepted normal-range cut-off), of whom two (one ASD and one SEN) had more definitely pathological L/M recovery ratios > 0.04. Conclusion: There is no statistically significant group difference in small intestine permeability in a population-derived cohort of children with ASD compared with a control group with SEN. Of the two children (one ASD and one SEN) with an L/M recovery ratio > 0.04, one had undiagnosed asymptomatic celiac disease (ASD) and the other (SEN) had undergone extensive surgery for gastroschisis.
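The screening arithmetic behind the quoted cut-offs is simple; a minimal sketch (function names are illustrative, the 0.03 and 0.04 thresholds are the values quoted in the abstract):

```python
def lm_recovery_ratio(lactulose_recovery, mannitol_recovery):
    """Urine lactulose/mannitol (L/M) recovery ratio."""
    return lactulose_recovery / mannitol_recovery

def permeability_flag(ratio, borderline=0.03, pathological=0.04):
    """Classify an L/M ratio against the cut-offs quoted in the study."""
    if ratio > pathological:
        return "pathological"
    if ratio > borderline:
        return "above normal range"
    return "normal"
```

On this scale the ASD group mean of 0.015 and SEN group mean of 0.014 both sit well inside the normal range, which is the abstract's central negative finding.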
Abstract:
The propagation of 7.335 MHz c.w. signals over a 5212 km sub-auroral, west-east path is studied. Measurements and semi-empirical predictions are made of the amplitude distributions and Doppler shifts of the received signals. The observed amplitude distribution is fitted with one produced by a numerical fading model, yielding the power losses suffered by the signals during propagation via the predominating modes. The signals are found to suffer exceptionally low losses at certain local times under geomagnetically quiet conditions. The mid-latitude trough in the F2-peak ionization density is predicted by a statistical model to lie at the latitudes of this path at these times and at low Kp values. A sharp cut-off in low power losses at a mean Kp of 2.75 strongly implicates the trough in the propagation of these signals. The Doppler shifts observed at these times cannot be explained by a simple ray-tracing model. It is shown, however, that a simple extension of this model to allow for the trough can reproduce the form of the observed diurnal variation.
Abstract:
Treffers-Daller and Korybski propose to operationalize language dominance on the basis of measures of lexical diversity, as computed, in this particular study, on transcripts of stories told by Polish-English bilinguals in each of their languages. They compute four different Indices of Language Dominance (ILD) on the basis of two different measures of lexical diversity, the Index of Guiraud (Guiraud, 1954) and HD-D (McCarthy & Jarvis, 2007). They compare simple indices, based on subtracting scores in one language from scores in the other, to more complex indices based on the formula Birdsong borrowed from the field of handedness, namely the ratio of (Difference in Scores) / (Sum of Scores). Positive scores on each of these Indices of Language Dominance mean that informants are more English-dominant, and negative scores that they are more Polish-dominant. The authors address the difficulty of comparing scores across languages by carefully lemmatizing the data. Following Flege, Mackay and Piske (2002), they also look into the validity of these indices by investigating to what extent they can predict scores on other, independently measured variables. They use correlations and regression analysis for this, which has the advantage that the dominance indices are used as continuous variables and arbitrary cut-off points between balanced and dominant bilinguals need not be chosen. However, they also show how the computation of z-scores can help facilitate a discussion about the appropriateness of different cut-off points across different data sets and measurement scales in those cases where researchers consider it necessary to make categorical distinctions between balanced and dominant bilinguals. Treffers-Daller and Korybski correlate the ILD scores with four other variables, namely length of residence in the UK, attitudes towards English and life in the UK, frequency of usage of English at home, and frequency of code-switching.
They found that the indices correlated significantly with most of these variables, but there were clear differences between the Guiraud-based indices and the HD-D-based indices. In a regression analysis, three of the measures were also found to be significant predictors of English language usage at home. They conclude that the correlations and the regression analyses lend strong support to the validity of their approach to language dominance.
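The Birdsong-style ratio index described above reduces to a single expression; a minimal sketch with hypothetical scores (any lexical-diversity measure, Guiraud or HD-D, could be plugged in for the two arguments):

```python
def dominance_index(english_score, polish_score):
    """Ratio index (Difference in Scores) / (Sum of Scores).
    Positive -> English-dominant, negative -> Polish-dominant,
    0 -> perfectly balanced."""
    return (english_score - polish_score) / (english_score + polish_score)
```

Unlike a raw difference, the ratio is bounded between -1 and 1 and scales with the speaker's overall score level, which makes scores comparable across bilinguals with very different absolute lexical-diversity values.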
Abstract:
In the present study, to shed light on the roles of the positional-error correction mechanism and the prediction mechanism in the proactive control discovered earlier, we carried out a visual tracking experiment in which the region where the target was shown was restricted on a circular orbit. The main results of this research were as follows. Recognition of a time step, obtained from the environmental stimuli, is required for the predictive function. The period of the rhythm in the brain obtained from environmental stimuli shortens by about 10% when the visual information is cut off. This shortening of the period accelerates the motion as soon as the visual information is cut off, and causes the hand motion to precede the target motion. Although the precedence of the hand in the blind region is reset by the environmental information when the target enters the visible region, the hand precedes the target on average when the predictive mechanism dominates the error-corrective mechanism.
Abstract:
Compulsive Internet Use (CIU) has been mostly studied among adolescents, yet some studies reveal that it can be a problem for the adult population too. The lack of agreement on diagnostic tools and cut-off points results in markedly different prevalence figures. Building on Charlton's (2002) distinction between core CIU and positive engagement dimensions, the first objective was to confirm that prevalence figures based on the core dimensions of CIU alone were lower than those including the engagement dimensions as well. Second, building on Davis's (2001) diathesis-stress model, we tested the role that self-concept clarity (SCC) and social support play in predicting core CIU in US subjects (N_US = 268). Finally, we expected that, because self-concept clarity is mostly linked to well-being in Western countries, the association between this variable and core CIU would be weak in the Eastern-culture sample (N_UAE = 270). Our findings confirmed that prevalence figures were 20–40% lower when including the core dimensions only, and that SCC is a key predictor of CIU at low levels of social support in the US. We also confirmed that this is not the case in the UAE. Future research opportunities to advance this study are discussed.
Abstract:
We use sunspot group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups R_B above a variable cut-off threshold of observed total whole-spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of R_B are then re-scaled to the full observed RGO group number R_A using a variety of regression techniques. It is found that a very high correlation between R_A and R_B (r_AB > 0.98) does not prevent large errors in the intercalibration (for example, sunspot maximum values can be over 30% too large even for such levels of r_AB). In generating the backbone sunspot number (R_BB), Svalgaard and Schatten (2015, this issue) force regression fits to pass through the origin of the scatter plot, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of quantile-quantile ("Q-Q") plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method is shown to differ when matching peak and average sunspot group numbers). However, other fits are only reliable if non-linear regression is used. From these results it is entirely possible that the inflation of solar cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
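The calibration pitfall described here, forcing a regression through the origin when the true relation has a non-zero offset, can be illustrated on synthetic data. Everything below (the offset, noise level, and sample size) is an assumption for illustration, not RGO data, and a Shapiro-Wilk test on the residuals stands in for the paper's Q-Q plot inspection:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic calibration pair: a lower-acuity observer undercounts
# with a non-zero offset (true relation has an intercept)
R_A = rng.uniform(2.0, 15.0, 120)                  # full-acuity group counts
R_B = 0.8 * R_A - 1.0 + rng.normal(0.0, 0.4, 120)  # reduced-acuity counts
R_B = np.clip(R_B, 0.0, None)                      # counts cannot be negative

# Fit forced through the origin (R_A = k * R_B) versus OLS with intercept
k_origin = float(R_A @ R_B) / float(R_B @ R_B)
ols = stats.linregress(R_B, R_A)

# Residual normality check (numerical proxy for a Q-Q plot)
p_origin = stats.shapiro(R_A - k_origin * R_B).pvalue
p_ols = stats.shapiro(R_A - (ols.intercept + ols.slope * R_B)).pvalue
```

Because the forced fit must absorb the missing intercept into its slope, `k_origin` comes out larger than the OLS slope, which is exactly the mechanism by which forced-origin calibration exaggerates cycle amplitudes in the re-scaled series.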