248 results for Precise ephemerides
Abstract:
In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
Abstract:
In phylogenetics, the unrooted model of phylogeny and the strict molecular clock model are two extremes of a continuum. Despite their dominance in phylogenetic inference, it is evident that both are biologically unrealistic and that the real evolutionary process lies between these two extremes. Fortunately, intermediate models employing relaxed molecular clocks have been described. These models open the gate to a new field of “relaxed phylogenetics.” Here we introduce a new approach to performing relaxed phylogenetic analysis. We describe how it can be used to estimate phylogenies and divergence times in the face of uncertainty in evolutionary rates and calibration times. Our approach also provides a means for measuring the clocklikeness of datasets and comparing this measure between different genes and phylogenies. We find no significant rate autocorrelation among branches in three large datasets, suggesting that autocorrelated models are not necessarily suitable for these data. In addition, we place these datasets on the continuum of clocklikeness between a strict molecular clock and the alternative unrooted extreme. Finally, we present analyses of 102 bacterial, 106 yeast, 61 plant, 99 metazoan, and 500 primate alignments. From these we conclude that our method is phylogenetically more accurate and precise than the traditional unrooted model while adding the ability to infer a timescale to evolution.
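The clock models compared in these two abstracts differ only in how a substitution rate is assigned to each branch. As a minimal illustrative sketch (not the implementation used in either study; the parameter values and the chain-shaped "tree" are assumptions made for brevity), rates under a strict clock, the uncorrelated relaxed clocks, and an autocorrelated relaxed clock can be drawn as follows:

```python
import math
import random

def branch_rates(n_branches, model, mean_rate=1e-3, sigma=0.5, seed=0):
    """Draw one substitution rate per branch under several clock models.
    Toy sketch: branches are treated as a simple chain so that the
    autocorrelated model can inherit each parent's rate."""
    rng = random.Random(seed)
    if model == "strict":                    # one global rate for all branches
        return [mean_rate] * n_branches
    if model == "uncorrelated_exponential":  # i.i.d. exponential draws
        return [rng.expovariate(1.0 / mean_rate) for _ in range(n_branches)]
    if model == "uncorrelated_lognormal":    # i.i.d. lognormal draws
        return [rng.lognormvariate(math.log(mean_rate), sigma)
                for _ in range(n_branches)]
    if model == "autocorrelated":            # each rate drifts from its parent's
        rates, rate = [], mean_rate
        for _ in range(n_branches):
            rate *= rng.lognormvariate(0.0, sigma)
            rates.append(rate)
        return rates
    raise ValueError(f"unknown model: {model}")
```

Under the uncorrelated models each branch's rate is independent of its neighbours, which is exactly the property the autocorrelation test described above probes for.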
Abstract:
In order to support intelligent transportation system (ITS) road safety applications such as collision avoidance, lane departure warnings and lane keeping, Global Navigation Satellite System (GNSS) based vehicle positioning systems have to provide lane-level (0.5 to 1 m) or even in-lane-level (0.1 to 0.3 m) accurate and reliable positioning information to vehicle users. However, current vehicle navigation systems equipped with a single-frequency GPS receiver can only provide road-level accuracy of 5-10 meters. The positioning accuracy can be improved to sub-meter level or better with augmented GNSS techniques such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP), which have traditionally been used in land surveying or in slowly moving environments. In these techniques, GNSS correction data generated from a local, regional or global network of GNSS ground stations are broadcast to the users via various communication data links, mostly 3G cellular networks and communication satellites. This research aimed to investigate the performance of precise positioning systems operating in high-mobility environments. This involved evaluating the performance of both the RTK and PPP techniques using: i) a state-of-the-art dual-frequency GPS receiver; and ii) a low-cost single-frequency GNSS receiver. Additionally, this research evaluated the effectiveness of several operational strategies in reducing the load that correction data transmission places on data communication networks, which may be problematic for future wide-area ITS service deployment. These strategies include the use of different data transmission protocols, different correction data format standards, and correction data transmission at less frequent intervals. A series of field experiments was designed and conducted for each research task. Firstly, the performance of the RTK and PPP techniques was evaluated in both static and kinematic (highway driving at speeds exceeding 80 km/h) experiments.
RTK solutions achieved an RMS precision of 0.09 to 0.2 meters in the static tests and 0.2 to 0.3 meters in the kinematic tests, while PPP achieved 0.5 to 1.5 meters in static and 1 to 1.8 meters in kinematic tests using the RTKLIB software. These RMS precision values could be further improved if better RTK and PPP algorithms were adopted. The test results also showed that RTK may be more suitable for lane-level-accuracy vehicle positioning. The professional-grade (dual-frequency) and mass-market-grade (single-frequency) GNSS receivers were tested for their RTK performance in static and kinematic modes. The analysis showed that mass-market-grade receivers provide good solution continuity, although their overall positioning accuracy is worse than that of professional-grade receivers. In an attempt to reduce the load on the data communication network, we first evaluated the use of different correction data format standards, namely the RTCM version 2.x and RTCM version 3.0 formats. A 24-hour transmission test was conducted to compare network throughput. The results showed that a 66% reduction in network throughput can be achieved by using the newer RTCM version 3.0 format compared with the older RTCM version 2.x format. Secondly, experiments were conducted to examine the use of two data transmission protocols, TCP and UDP, for correction data transmission through the Telstra 3G cellular network. The performance of each transmission method was analysed in terms of packet transmission latency, packet dropout, packet throughput and packet retransmission rate. The overall network throughput and latency of UDP data transmission were 76.5% and 83.6% of those of TCP data transmission, while the overall accuracy of the positioning solutions remained at the same level. Additionally, due to the nature of UDP transmission, 0.17% of UDP packets were lost during the kinematic tests, but this loss did not lead to a significant reduction in the quality of the positioning results.
The experimental results from the static and kinematic field tests also showed that the mobile network communication may be blocked for a couple of seconds, but the positioning solutions can be kept at the required accuracy level by setting the Age of Differential appropriately. Finally, we investigated the effects of using less frequent correction data (transmitted at 1, 5, 10, 15, 20, 30 and 60 second intervals) on the precise positioning system. As the interval increases, the percentage of ambiguity-fixed solutions gradually decreases, while the positioning error increases from 0.1 to 0.5 meters. The results showed that the positioning accuracy could still be kept at the in-lane level (0.1 to 0.3 m) when using correction data transmitted at intervals of up to 20 seconds.
Abstract:
An earlier study by the Asian Development Bank (ADB) showed that the annual cost of road traffic accidents in 2001 was S$699.36 million, which was 0.5% of the annual GDP. This paper attempts to update these cost estimates. More precise methods of computing the human cost, lost output and property damage are adopted, resulting in an annual cost of S$610.3 million, or 0.338% of the annual GDP, in 2003. A more conservative estimate of S$878,000 per fatal accident is also obtained, compared with the earlier figure of S$1.4 million. This study has shown that it is necessary to update the annual traffic accident costs regularly, as the figures vary with the number of accidents, which changes over time.
Abstract:
BACKGROUND: Hallux valgus (HV) is a foot deformity commonly seen in medical practice, often accompanied by significant functional disability and foot pain. Despite frequent mention in a diverse body of literature, a precise estimate of the prevalence of HV is difficult to ascertain. The purpose of this systematic review was to investigate the prevalence of HV in the overall population and evaluate the influence of age and gender. METHODS: Electronic databases (Medline, Embase, and CINAHL) and reference lists of included papers were searched to June 2009 for papers on HV prevalence, without language restriction. MeSH terms and keywords relating to HV or bunions, prevalence and various synonyms were used. Included studies were surveys reporting original data on the prevalence of HV or bunions in healthy populations of any age group. Surveys reporting prevalence data grouped with other foot deformities or in specific disease groups (e.g. rheumatoid arthritis, diabetes) were excluded. Two independent investigators quality-rated all included papers using the Epidemiological Appraisal Instrument. Data on raw prevalence, population studied and methodology were extracted. Prevalence proportions and their standard errors were calculated, and meta-analysis was performed using a random effects model. RESULTS: A total of 78 papers reporting results of 76 surveys (total 496,957 participants) were included and grouped by study population for meta-analysis. Pooled prevalence estimates for HV were 23% in adults aged 18-65 years (CI: 16.3 to 29.6) and 35.7% in elderly people aged over 65 years (CI: 29.5 to 42.0). Prevalence increased with age and was higher in females [30% (CI: 22 to 38)] compared with males [13% (CI: 9 to 17)]. Potential sources of bias were sampling method, study quality and method of HV diagnosis. CONCLUSIONS: Notwithstanding the wide variation in estimates, it is evident that HV is prevalent; more so in females and with increasing age.
Methodological quality issues need to be addressed in interpreting reports in the literature and in future research.
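The pooling step described in the methods above (per-study prevalence proportions and standard errors combined under a random effects model) can be sketched as a standard DerSimonian-Laird computation. This is a generic implementation under a normal approximation to the proportion's variance, not the authors' analysis code, and the study counts in any usage would be made up:

```python
import math

def pooled_prevalence(events, totals):
    """Random-effects (DerSimonian-Laird) pooled prevalence estimate.
    events/totals: per-study case counts and sample sizes.
    Returns (pooled proportion, 95% confidence interval)."""
    # per-study proportion and variance (normal approximation)
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]
    w = [1 / vi for vi in v]                      # fixed-effect weights
    p_fe = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    # Cochran's Q heterogeneity statistic and DerSimonian-Laird tau^2
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p))
    df = len(p) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights: between-study variance widens each study's variance
    w_re = [1 / (vi + tau2) for vi in v]
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se)
```

The tau-squared term is what distinguishes the random effects model used here from a fixed-effect pooling: heterogeneous surveys are down-weighted less aggressively, widening the pooled confidence interval.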
Abstract:
STUDY DESIGN: Controlled laboratory study. OBJECTIVES: To investigate the reliability and concurrent validity of photographic measurements of hallux valgus angle compared to radiographs as the criterion standard. BACKGROUND: Clinical assessment of hallux valgus involves measuring alignment between the first toe and metatarsal on weight-bearing radiographs or visually grading the severity of deformity with categorical scales. Digital photographs offer a noninvasive method of measuring deformity on an exact scale; however, the validity of this technique has not previously been established. METHODS: Thirty-eight subjects (30 female, 8 male) were examined (76 feet, 54 with hallux valgus). Computer software was used to measure hallux valgus angle from digital records of bilateral weight-bearing dorsoplantar foot radiographs and photographs. One examiner measured 76 feet on 2 occasions 2 weeks apart, and a second examiner measured 40 feet on a single occasion. Reliability was investigated by intraclass correlation coefficients and validity by 95% limits of agreement. The Pearson correlation coefficient was also calculated. RESULTS: Intrarater and interrater reliability were very high (intraclass correlation coefficients greater than 0.96) and 95% limits of agreement between photographic and radiographic measurements were acceptable. Measurements from photographs and radiographs were also highly correlated (Pearson r = 0.96). CONCLUSIONS: Digital photographic measurements of hallux valgus angle are reliable and have acceptable validity compared to weight-bearing radiographs. This method provides a convenient and precise tool for the assessment of hallux valgus, while avoiding the cost and radiation exposure associated with radiographs.
Abstract:
Modern toxicology investigates a wide array of both old and new health hazards. Priority setting is needed to select agents for research from the plethora of exposure circumstances. Changing societies and a growing fraction of aged people have to be taken into consideration. Precise exposure assessment is important for risk estimation and regulation. Toxicology contributes to the exploration of pathomechanisms to specify the exposure metrics for risk estimation. Combined effects of co-existing agents are not yet sufficiently understood. Animal experiments allow the separate administration of agents that cannot be disentangled by epidemiological means, but their value is limited at the low exposure levels found in many of today’s settings. As an experimental science, toxicology has to keep pace with the rapidly growing knowledge about the language of the genome and the changing paradigms in cancer development. Toxicogenomics was developed during the pioneer era of assembling a working draft of the human genome. Gene and pathway complexity have to be considered when investigating gene–environment interactions. To conduct studies well, modern toxicology needs a close liaison with many other disciplines, such as epidemiology and bioinformatics.
Abstract:
The topographic structural complexity of a reef is highly correlated with coral growth rates, coral cover and overall levels of biodiversity, and is therefore integral in determining ecological processes. Modeling these processes commonly includes measures of rugosity obtained from a wide range of survey techniques that often fail to capture rugosity at different spatial scales. Here we show that accurate estimates of rugosity can be obtained from video footage captured using underwater video cameras (i.e., monocular video). To demonstrate the accuracy of our method, we compared the results to in situ measurements of a 2 m × 20 m area of forereef at Glovers Reef atoll in Belize. Sequential pairs of images were used to compute fine-scale bathymetric reconstructions of the reef substrate, from which precise measurements of rugosity and reef topographic structural complexity can be derived across multiple spatial scales. To achieve accurate bathymetric reconstructions from uncalibrated monocular video, the position of the camera for each image in the video sequence and the intrinsic parameters (e.g., focal length) must be computed simultaneously. We show that these parameters can often be determined when the data exhibit parallax-type motion, and that rugosity and reef complexity can be accurately computed from existing video sequences taken with any type of underwater camera in any reef habitat or location. This technique opens a wide array of possibilities for future coral reef research by providing a cost-effective and automated method of determining structural complexity and rugosity in both new and historical video surveys of coral reefs.
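The rugosity index that the bathymetric reconstructions feed into is, at its simplest, the ratio of surface (chain) length to planar length along a profile, and recomputing it on subsampled profiles gives the multi-scale estimates mentioned above. A minimal sketch of that index (illustrative only; the study's actual pipeline works on full 3D reconstructions rather than 1D profiles):

```python
import math

def rugosity(depths, spacing):
    """Linear rugosity index: surface (chain) length divided by planar
    length, for a depth profile sampled at a fixed horizontal spacing.
    A perfectly flat profile gives 1.0; rougher profiles give more."""
    surface = sum(
        math.hypot(spacing, depths[i + 1] - depths[i])
        for i in range(len(depths) - 1)
    )
    planar = spacing * (len(depths) - 1)
    return surface / planar

def rugosity_at_scale(depths, spacing, step):
    """Recompute rugosity after keeping every `step`-th sample,
    mimicking measurement at a coarser spatial scale."""
    return rugosity(depths[::step], spacing * step)
```

Because fine relief disappears when the profile is subsampled, rugosity typically decreases with coarser scale, which is why single-scale survey techniques can misrepresent reef complexity.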
Abstract:
Bioclastic flow deposits offshore from the Soufrière Hills volcano on Montserrat in the Lesser Antilles were deposited by the largest-volume sediment flows near this active volcano in the last 26 kyr. The volume of these deposits exceeds that of the largest historic volcanic dome collapse in the world, which occurred on Montserrat in 2003. These flows were most probably generated by a large submarine slope failure of the carbonate shelf comprising the southwest flank of Antigua or the east flank of Redonda, adjacent islands that are not volcanically active. The bioclastic flow deposits are relatively coarse-grained and either ungraded or poorly graded, and were deposited by non-cohesive debris flows and high-density turbidity currents. The bioclastic deposit often comprises multiple sub-units that cannot be correlated between core sites, some located just 2 km apart. Multiple sub-units in the bioclastic deposit result from flow reflection, stacking of multiple debris flow lobes, and/or multi-stage collapse of the initial landslide. This study provides unusually precise constraints on the age of this mass flow event, which occurred at ca 14 ka. Few large submarine landslides have been well dated, but the slope failures that have been dated are commonly associated with periods of rapid sea-level change.
Abstract:
A software tool (DRONE) has been developed to evaluate road traffic noise over a large area, taking into account dynamic network traffic flow and buildings. For more precise estimation of noise in urban networks, where vehicles are mainly in stop-and-go running conditions, vehicle sound power levels (for accelerating, decelerating, cruising and idling vehicles) are incorporated in DRONE. The calculation performance of DRONE is increased by evaluating the noise in two steps: first estimating a unit noise database and then integrating it with the traffic simulation. Details of the process from traffic simulation to contour maps are discussed in the paper, and the implementation of DRONE for Tsukuba city is presented.
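A unit-noise-database approach like the one described above ultimately has to aggregate per-vehicle sound power contributions into a total level, and sound levels in decibels combine energetically rather than arithmetically. A small sketch of that standard decibel summation (generic acoustics, not DRONE's internals):

```python
import math

def combine_levels(levels_db):
    """Energetic (logarithmic) sum of sound levels in dB: convert each
    level to a linear power ratio, add, and convert back."""
    return 10.0 * math.log10(sum(10.0 ** (level / 10.0) for level in levels_db))

# Two equal 70 dB sources combine to about 73 dB (a 3 dB increase),
# not 140 dB - doubling the sound energy adds 10*log10(2) dB.
```

This is why doubling the simulated traffic volume on a link raises the predicted level by roughly 3 dB rather than doubling it.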
Abstract:
Starting from the vantage point that explaining success at creating a venture should be the unique contribution (or at least one unique contribution) of entrepreneurship research, we argue that this success construct has not yet been adequately defined and operationalized. We thus offer suggestions for a more precise conceptualization and measurement of this central construct. Rather than regarding the various success proxies used in prior research as poor operationalizations of success, we argue that they represent other important aspects of the venture creation process: engagement, persistence and progress. We hold that, in order to attain a better understanding of venture creation, these constructs also need to be theoretically defined. Further, their respective drivers need to be theorized and tested separately. We suggest theoretical definitions of each. We then develop and test hypotheses concerning how human capital, venture idea novelty and business planning have different impacts on the different assessments of the process represented by engagement, persistence, progress and success. The results largely confirm the stated hypotheses, suggesting that the conceptual and empirical approach we propose is a path towards an improved understanding of the central entrepreneurship phenomenon of new venture creation.
Abstract:
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
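The ABC rejection algorithm that the utility function rests on is simple to state: draw parameters from the prior, simulate data under each draw, and keep the draws whose simulated summaries lie closest to the observation. A self-contained sketch with a toy Gaussian example (the model, prior and tolerance quantile here are illustrative assumptions, not the epidemic or macroparasite models of the paper):

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_sampler, distance,
                  n_sims=5000, quantile=0.01):
    """Plain ABC rejection: sample parameters from the prior, simulate,
    and keep the fraction of draws closest to the observed summary."""
    draws = []
    for _ in range(n_sims):
        theta = prior_sampler()
        draws.append((distance(simulate(theta), observed), theta))
    draws.sort(key=lambda pair: pair[0])       # smallest discrepancy first
    keep = max(1, int(n_sims * quantile))
    return [theta for _, theta in draws[:keep]]

# Toy example (hypothetical): infer the mean of a Gaussian with known
# sd = 1 from an observed sample mean of 2.0, under a Uniform(0, 5) prior.
random.seed(0)
posterior = abc_rejection(
    observed=2.0,
    simulate=lambda mu: statistics.mean(random.gauss(mu, 1.0) for _ in range(20)),
    prior_sampler=lambda: random.uniform(0.0, 5.0),
    distance=lambda sim, obs: abs(sim - obs),
)
estimate = statistics.mean(posterior)  # should fall near 2.0
```

Because no likelihood is ever evaluated, the same loop applies unchanged to models where only forward simulation is tractable, which is what makes the pre-computed-simulation design strategy in the paper feasible.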
Abstract:
The importance of the environment to the fulfilment of human rights is widely accepted at international law. What is less well-accepted is the proposition that we, as humans, possess rights to the environment beyond what is necessary to support our basic human needs. The suggestion that a human right to a healthy environment may be emerging at international law raises a number of theoretical and practical challenges for human rights law, with such challenges coming from both within and outside the human rights discourse. It is argued that human rights law can make a positive contribution to environmental protection, but the precise nature of the connection between the environment and human rights warrants more critical analysis. This short paper considers the different ways that the environment is conceptualised in international human rights law and analyses the proposition that a right to a healthy environment is emerging. It identifies some of the challenges which would need to be overcome before such a right could be recognised, including those which draw on the disciplines of deep ecology and earth jurisprudence.
Abstract:
Traffic safety studies demand more than existing micro-simulation models can offer, as these models postulate that every driver exhibits safe behaviour. All microscopic traffic simulation models contain a car-following model, and the Gazis–Herman–Rothery (GHR) car-following model is one of the most widely used. This paper highlights the limitations of the GHR car-following model in modelling longitudinal driving behaviour for safety study purposes. The study reviews and compares different versions of the GHR model. To enable the GHR model to reproduce safety metrics precisely, a new set of car-following model parameters is offered to simulate unsafe vehicle conflicts. NGSIM vehicle trajectory data are used to evaluate the new model, and short following headways and time to collision are employed to assess critical safety events within the traffic flow. Risky events are extracted from the available NGSIM data to evaluate the modified model against generic versions of the GHR model. The results from the simulation tests illustrate that the proposed model predicts the safety metrics better than the generic GHR model and can potentially facilitate assessing and predicting the safety of traffic facilities using microscopic simulation. The new model can also predict near-miss rear-end crashes.
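The GHR family that the paper modifies has a general stimulus-response form: the follower's acceleration is proportional to the relative speed of the leader, scaled by the follower's own speed and inversely by the spacing. A minimal sketch of that rule (the default parameter values are illustrative, not the calibrated safety-oriented set the paper proposes):

```python
def ghr_acceleration(v, dv, dx, c=1.1, m=0.9, l=1.0):
    """General Gazis-Herman-Rothery stimulus-response rule:
        a(t) = c * v(t)**m * dv(t - T) / dx(t - T)**l
    v : follower speed at time t (m/s)
    dv: leader speed minus follower speed at the lagged time t - T (m/s)
    dx: spacing between the vehicles at t - T (m)
    c, m, l are sensitivity parameters; different published versions of
    the GHR model correspond to different (m, l) choices."""
    return c * (v ** m) * dv / (dx ** l)

# Closing in on the leader (dv < 0) produces braking, opening up (dv > 0)
# produces acceleration, and the response weakens as the spacing dx grows.
```

The safety-relevant limitation discussed above follows directly from this form: whenever the two vehicles travel at the same speed (dv = 0) the model prescribes zero acceleration regardless of how short the spacing is, so unsafe close-following states are never corrected.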
Abstract:
Gait freezing is an episodic arrest of locomotion due to an inability to take normal steps. Pedunculopontine nucleus stimulation is an emerging therapy proposed to improve gait freezing, even where refractory to medication. However, the efficacy and precise effects of pedunculopontine nucleus stimulation on Parkinsonian gait disturbance are not established. The clinical application of this new therapy is controversial, and it is unknown whether bilateral stimulation is more effective than unilateral. Here, in a double-blinded study using objective spatiotemporal gait analysis, we assessed the impact of unilateral and bilateral pedunculopontine nucleus stimulation on triggered episodes of gait freezing and on background deficits of unconstrained gait in Parkinson’s disease. Under experimental conditions, while OFF medication, Parkinsonian patients with severe gait freezing implanted with pedunculopontine nucleus stimulators below the pontomesencephalic junction were assessed under three conditions: off stimulation, unilateral stimulation and bilateral stimulation. Results were compared with those of Parkinsonian patients without gait freezing matched for disease severity, and with healthy controls. Pedunculopontine nucleus stimulation improved objective measures of gait freezing, with bilateral stimulation more effective than unilateral. During unconstrained walking, Parkinsonian patients who experienced gait freezing had reduced step length and increased step length variability compared to patients without gait freezing; however, these deficits were unchanged by pedunculopontine nucleus stimulation. Chronic pedunculopontine nucleus stimulation improved Freezing of Gait Questionnaire scores, reflecting a reduction of the freezing encountered in patients’ usual environments and medication states.
This study provides objective, double-blinded evidence that in a specific subgroup of Parkinsonian patients, stimulation of a caudal pedunculopontine nucleus region selectively improves gait freezing but not background deficits in step length. Bilateral stimulation was more effective than unilateral.