Abstract:
Converging evidence from epidemiological, clinical and neuropsychological research suggests a link between cannabis use and increased risk of psychosis. Long-term cannabis use has also been related to deficit-like "negative" symptoms and cognitive impairment that resemble some of the clinical and cognitive features of schizophrenia. The current functional brain imaging study investigated the impact of a history of heavy cannabis use on impaired executive function in first-episode schizophrenia patients. While participants performed the Tower of London task in a magnetic resonance imaging scanner, event-related blood oxygenation level-dependent (BOLD) brain activation was compared between four age- and gender-matched groups: 12 first-episode schizophrenia patients; 17 long-term cannabis users; seven cannabis-using first-episode schizophrenia patients; and 17 healthy control subjects. BOLD activation was assessed as a function of increasing task difficulty within and between groups, as were the main effects of cannabis use and the diagnosis of schizophrenia. Cannabis users and non-drug-using first-episode schizophrenia patients exhibited equivalently reduced dorsolateral prefrontal activation in response to task difficulty. A trend towards additional prefrontal and left superior parietal cortical activation deficits was observed in cannabis-using first-episode schizophrenia patients, while a history of cannabis use accounted for increased activation in the visual cortex. Cannabis users and schizophrenia patients fail to adequately activate the dorsolateral prefrontal cortex, pointing to a common working memory impairment that is particularly evident in cannabis-using first-episode schizophrenia patients. A history of heavy cannabis use, on the other hand, accounted for increased primary visual processing, suggesting compensatory imagery-based processing of the task.
Abstract:
One of the objectives of this study was to evaluate soil testing equipment based on its capability of measuring in-place stiffness or modulus values. As design criteria transition from empirical to mechanistic-empirical, soil test methods and equipment that measure properties such as stiffness and modulus, and how they relate to Florida materials, are needed. Requirements for the selected equipment are that it be portable, cost effective, reliable, accurate, and repeatable. A second objective is that the selected equipment measures soil properties without the use of nuclear materials.

The current device used to measure soil compaction is the nuclear density gauge (NDG). Equipment evaluated in this research included lightweight deflectometers (LWD) from different manufacturers, a dynamic cone penetrometer (DCP), a GeoGauge, a Clegg impact soil tester (CIST), a Briaud compaction device (BCD), and a seismic pavement analyzer (SPA). Evaluations were conducted over ranges of measured densities and moisture contents.

Testing (Phases I and II) was conducted in a test box and test pits. Phase III testing was conducted on materials found on five construction projects located in the Jacksonville, Florida, area. Phase I analyses determined that the GeoGauge had the lowest overall coefficient of variation (COV). In ascending order of COV were the accelerometer-type LWD, the geophone-type LWD, the DCP, the BCD, and the SPA, which had the highest overall COV. As a result, the BCD and the SPA were excluded from Phase II testing.

In Phase II, measurements obtained from the selected equipment were compared to the modulus values obtained by the static plate load test (PLT), the resilient modulus (MR) from laboratory testing, and the NDG measurements. To minimize variability in soil and moisture content, a single-spot testing sequence was developed: at each location, test results obtained from the portable equipment under evaluation were compared to the values from adjacent NDG, PLT, and laboratory MR measurements. Correlations were developed through statistical analysis, and target values were developed for various soils for verification on similar soils field tested in Phase III.

The single-spot testing sequence was also employed in Phase III, field testing performed on A-3 and A-2-4 embankments, limerock-stabilized subgrade, limerock base, and graded aggregate base found on Florida Department of Transportation construction projects. The Phase II and Phase III results provided potential trend information for future research, specifically data collection for in-depth statistical analysis of correlations with the laboratory MR for specific soil types under specific moisture conditions. With the collection of enough data, stronger relationships could be expected between measurements from the portable equipment and the MR values.

Based on the statistical analyses and the experience gained from extensive use of the equipment, the combination of the DCP and the LWD was selected for in-place soil testing for compaction control acceptance. Test methods and developmental specifications were written for the DCP and the LWD; the developmental specifications include target values for the compaction control of embankment, subgrade, and base materials.
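As a rough illustration of the Phase I repeatability comparison, the sketch below ranks devices by coefficient of variation over repeated readings. The device names echo those above, but the readings, units, and values are hypothetical placeholders, not the study's data.

```python
import statistics

# Hypothetical repeated stiffness readings (MPa) per device;
# values are illustrative only, not the study's measurements.
readings = {
    "GeoGauge":            [92.1, 93.0, 91.5, 92.6, 92.2],
    "LWD (accelerometer)": [88.4, 91.2, 86.9, 90.1, 89.5],
    "LWD (geophone)":      [85.0, 90.3, 83.7, 88.9, 86.4],
    "DCP":                 [80.2, 88.1, 78.5, 86.0, 82.3],
}

def cov(values):
    """Coefficient of variation: sample standard deviation divided by mean."""
    return statistics.stdev(values) / statistics.fmean(values)

# Rank devices from most to least repeatable (ascending COV),
# mirroring the Phase I comparison described above.
for name, vals in sorted(readings.items(), key=lambda kv: cov(kv[1])):
    print(f"{name:22s} COV = {cov(vals):.3%}")
```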
Abstract:
This study uses the reverse salient methodology to contrast subsystems in video game consoles in order to discover, characterize, and forecast the most significant technology gap. We build on the current methodologies for measuring the magnitude of Reverse Salience (Performance Gap and Time Gap) by demonstrating the effectiveness of the Performance Gap Ratio (PGR). The three subject subsystems in this analysis are CPU Score, GPU core frequency, and video memory bandwidth. CPU Score is a metric developed for this project: the product of core frequency, number of parallel cores, and instruction size. We measure the Performance Gap of each subsystem against concurrently available PC hardware on the market and use PGR to normalize the evolution of these technologies for comparative analysis. The results indicate that while CPU performance has historically been the Reverse Salient, video memory bandwidth has taken over as the fastest-growing technology gap in the current generation. Finally, we create a technology forecasting model that shows how much the video memory bandwidth gap will grow through 2019 should the current trend continue. This analysis can assist console developers in allocating resources to the next generation of platforms, which will ultimately result in longer hardware life cycles.
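For concreteness, here is a minimal sketch of the CPU Score product and one plausible Performance Gap Ratio normalization (gap taken relative to the leading PC value). All hardware figures are invented placeholders, and the exact PGR formula is an assumption, not taken from the paper.

```python
# CPU Score as defined above: core frequency (GHz) x parallel cores
# x instruction size (bits). All numbers below are hypothetical.
def cpu_score(freq_ghz: float, cores: int, instruction_bits: int) -> float:
    return freq_ghz * cores * instruction_bits

# Hypothetical console vs. contemporaneous PC hardware in a given year.
console_score = cpu_score(freq_ghz=1.6, cores=8, instruction_bits=64)
pc_score      = cpu_score(freq_ghz=3.5, cores=8, instruction_bits=64)

# Performance Gap: absolute shortfall of the console subsystem.
performance_gap = pc_score - console_score

# Performance Gap Ratio (assumed normalization): gap relative to the
# leading (PC) performance, so subsystems measured in different units
# can be compared on a common dimensionless scale.
pgr = performance_gap / pc_score

print(f"console={console_score:.0f}  pc={pc_score:.0f}  "
      f"gap={performance_gap:.0f}  PGR={pgr:.2f}")
```

Normalizing by the leading technology's performance is what lets gaps in frequency, bandwidth, and composite scores be tracked on one chart over time.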
Abstract:
Change point estimation is recognized as an essential tool of root cause analysis within quality control programs, as it enables clinical experts to search for potential causes of change in hospital outcomes more effectively. In this paper, we consider estimation of the time at which a linear trend disturbance occurred in survival time following an in-control clinical intervention, in the presence of variable patient mix. To model the process and change point, a linear trend in the survival time of patients who underwent cardiac surgery is formulated using hierarchical models in a Bayesian framework. The data are right censored since monitoring is conducted over a limited follow-up period. We capture the effect of pre-surgery risk factors using a Weibull accelerated failure time regression model. We use Markov chain Monte Carlo to obtain posterior distributions of the change point parameters, including the location and slope of the trend, along with corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulation, and the results show that precise estimates can be obtained when it is used in conjunction with risk-adjusted survival time cumulative sum (CUSUM) control charts for different trend scenarios. Compared with the alternatives, a step change point model and the built-in CUSUM estimator, the proposed Bayesian estimator provides more accurate and precise estimates over linear trends. These advantages are reinforced when the probability quantification, flexibility, and generalizability of the Bayesian change point detection model are also considered.
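As a hedged sketch of the kind of model described, the following PyMC code fits a right-censored Weibull accelerated failure time model with a linear trend in log survival scale that begins at an unknown change point. The priors, the single risk factor, and all simulated values are assumptions for illustration, not the authors' specification.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n = 300
t = np.arange(n, dtype=float)   # surgery sequence number (process time)
x = rng.normal(size=n)          # a pre-surgery risk score (hypothetical)

# Simulate: log survival scale drifts linearly after tau_true = 180.
tau_true, delta_true = 180.0, -0.004
log_scale = 3.0 + 0.3 * x + delta_true * np.clip(t - tau_true, 0.0, None)
T = rng.weibull(1.5, size=n) * np.exp(log_scale)
censor_at = float(np.quantile(T, 0.8))  # limited follow-up -> right censoring
y = np.minimum(T, censor_at)

with pm.Model() as model:
    tau   = pm.Uniform("tau", 0.0, n)     # change point location
    delta = pm.Normal("delta", 0.0, 0.05) # trend slope after the change point
    b0    = pm.Normal("b0", 0.0, 5.0)
    b1    = pm.Normal("b1", 0.0, 5.0)     # risk-factor effect (risk adjustment)
    alpha = pm.HalfNormal("alpha", 5.0)   # Weibull shape

    trend = delta * pm.math.maximum(t - tau, 0.0)
    scale = pm.math.exp(b0 + b1 * x + trend)  # per-patient AFT scale

    # Right censoring at the end of the follow-up window.
    pm.Censored("obs", pm.Weibull.dist(alpha=alpha, beta=scale),
                lower=None, upper=censor_at, observed=y)

    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

# Posterior means for the change point location and trend slope.
print(idata.posterior["tau"].mean().item(),
      idata.posterior["delta"].mean().item())
```

The posterior of `tau` directly quantifies uncertainty about when the trend began, which is the probabilistic interval the abstract refers to.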
Abstract:
The third edition of the Australian Standard AS1742 Manual of Uniform Traffic Control Devices Part 7 provides a method of calculating the sighting distance required to safely proceed at passive level crossings, based on the physics of moving vehicles. The required distance grows with higher line speeds and with slower, heavier road vehicles, so the calculation can return quite a long sighting distance. At such distances, however, there are concerns about whether drivers can reliably identify a train in order to make an informed decision about whether it is safe to proceed across the level crossing. To determine whether drivers can make reliable judgements in these circumstances, this study assessed the distance at which a train first becomes identifiable to a driver as well as their ability to detect the movement of the train. A site was selected in Victoria, and 36 participants with good visual acuity observed 4 trains travelling in the 100-140 km/h range. While most participants could detect the train from a very long distance (2.2 km on average), they could only detect that the train was moving at much shorter distances (1.3 km on average). Large variability was observed between participants, with 4 participants consistently detecting trains later than the others. Participants tended to improve in their capacity to detect the presence of the train with practice, but a similar trend was not observed for detection of the movement of the train. Participants were consistently poor at judging the approach speed of trains, with large underestimations at all investigated distances.