Abstract:
BACKGROUND Patients with downbeat nystagmus syndrome suffer from oscillopsia, which leads to an unstable visual perception and therefore impaired visual acuity. The aim of this study was to use real-time computer-based visual feedback to compensate for the destabilizing slow phase eye movements. METHODS The patients sat in front of a computer screen with the head fixed on a chin rest. Eye movements were recorded by an eye tracking system (EyeSeeCam®). We tested visual acuity with a fixed Landolt C (static condition) and during a real-time feedback-driven condition (dynamic), in gaze straight ahead and in 20° sideward gaze. In the dynamic condition, the Landolt C moved according to the slow phase eye velocity of the downbeat nystagmus. The Shapiro-Wilk test was used to test for normal distribution and one-way ANOVA for comparisons. RESULTS Ten patients with downbeat nystagmus were included in the study. Median age was 76 years and the median duration of symptoms was 6.3 years (SD ± 3.1 years). The mean slow phase velocity (SPV) was moderate during gaze straight ahead (1.44°/s, SD ± 1.18°/s) and increased significantly in sideward gaze (mean left 3.36°/s; right 3.58°/s). In gaze straight ahead, we found no difference between the static and feedback-driven conditions. In sideward gaze, visual acuity improved in five out of ten subjects during the feedback-driven condition (p = 0.043). CONCLUSIONS This study provides proof of concept that non-invasive real-time computer-based visual feedback can compensate for the SPV in downbeat nystagmus. Real-time visual feedback may therefore be a promising aid for patients suffering from oscillopsia and impaired text reading on screen. Recent technological advances in virtual reality displays might soon render this approach feasible in fully mobile settings.
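The dynamic condition describes a simple closed-loop compensation: each frame, the optotype is shifted by the integrated slow-phase eye velocity so that its position on the retina stays stable. A minimal Python sketch of that principle follows; `read_slow_phase_velocity` and `draw_landolt_c` are hypothetical placeholders, not the authors' actual EyeSeeCam or stimulus API.

```python
# Minimal sketch of the feedback principle: each frame, the optotype is
# displaced by the integrated slow-phase eye velocity so it stays stable on
# the retina despite the nystagmus drift. Both helper functions below are
# placeholders (assumptions), not the study's real eye-tracker/display API.
import time

def read_slow_phase_velocity():
    """Placeholder: current vertical slow-phase eye velocity in deg/s."""
    return 1.44  # e.g., the mean SPV reported for gaze straight ahead

def draw_landolt_c(position_deg):
    """Placeholder: render the Landolt C at the given vertical offset (deg)."""
    pass

def feedback_loop(duration_s=5.0, frame_rate_hz=60.0):
    dt = 1.0 / frame_rate_hz
    position = 0.0  # optotype offset from screen center, in degrees
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        spv = read_slow_phase_velocity()
        position += spv * dt          # move with the slow phase ...
        draw_landolt_c(position)      # ... so the retinal image stays put
        time.sleep(dt)

feedback_loop(duration_s=1.0)
```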
Abstract:
Although negative density dependence (NDD) can facilitate tree species coexistence in forests, the underlying mechanisms can differ, and the dynamics of seedlings and saplings are rarely studied together. Here we present and discuss a novel mechanism based on our investigation of NDD predictions for the large, grove-forming, ectomycorrhizal, mast-fruiting tree Microberlinia bisulcata (Caesalpiniaceae) in an 82.5-ha plot at Korup, Cameroon. We tested whether juvenile density, size, growth and survival decrease with increasing conspecific adult basal area for 3245 ‘new’ seedlings and 540 ‘old’ seedlings (< 75 cm tall) during an approximately 4-year study period (2008–2012), and for 234 ‘saplings’ (≥ 75 cm tall) during an approximately 6-year study period (2008–2014). We found that the densities of new seedlings, old seedlings and saplings were, respectively, positively related, unrelated and negatively related to increasing basal area. Maximum leaf numbers and heights of old seedlings were negatively correlated with increasing basal area, as were sapling heights and stem diameters. Whereas survivorship of new seedlings decreased by more than one-half with increasing basal area over its range in 2010–2012, that of old seedlings decreased by almost two-thirds, but only in 2008–2010, and was generally unrelated to conspecific seedling density. In 2010–2012, relative growth rates in new seedlings’ heights decreased with increasing basal area, as well as with increasing seedling density and leaf numbers, whereas old seedlings’ growth was unrelated to either conspecific density or basal area. Saplings of below-average height had reduced survivorship with increasing basal area (probability decreasing from approximately 0.4 to 0.05 over the basal area range tested), but only sapling growth in terms of leaf numbers decreased with increasing basal area. These static and dynamic results indicate that NDD is operating within this system, possibly stabilizing the M. bisulcata population. However, these NDD patterns are unlikely to be caused by symmetric competition or by consumers. Instead, an alternative mechanism for conspecific adult–juvenile negative feedback is proposed, one which involves the interaction between tree phenology and ectomycorrhizal linkages.
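As an illustration of the kind of NDD test described (not the study's actual analysis), juvenile survival can be regressed on conspecific adult basal area and seedling density with a logistic model; simulated data stand in for the field measurements, and a negative basal-area coefficient indicates NDD.

```python
# Sketch of a generic NDD survival test on simulated data (the variable
# names and effect sizes are illustrative assumptions, not the study's).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "adult_ba": rng.uniform(0, 30, n),       # conspecific adult basal area
    "seedling_density": rng.poisson(5, n),   # conspecific seedlings nearby
})
# Simulate survival with NDD: higher adult BA lowers the odds of survival.
logit_p = 1.0 - 0.08 * df.adult_ba - 0.05 * df.seedling_density
df["survived"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("survived ~ adult_ba + seedling_density", data=df).fit()
print(model.summary())  # a negative adult_ba coefficient indicates NDD
```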
Abstract:
The goal of this work was to improve the performance of, and to calibrate, one of the ROSINA sensors, the Reflectron-type Time-Of-Flight mass spectrometer, currently flying aboard the ESA Rosetta spacecraft. Different optimization techniques were applied to both the lab and space models, and a static calibration was performed using different gas species expected to be detected in the vicinity of comet 67P/Churyumov-Gerasimenko. The database thus created was successfully applied to space data, giving results consistent with the other ROSINA sensors.
Abstract:
BACKGROUND Data evaluating the chronological order of appearance of extraintestinal manifestations (EIMs) relative to the time of inflammatory bowel disease (IBD) diagnosis are currently lacking. We aimed to assess the type, frequency, and chronological order of appearance of EIMs in patients with IBD. METHODS Data from the Swiss Inflammatory Bowel Disease Cohort Study were analyzed. RESULTS Data on 1249 patients were analyzed (49.8% female; median age: 40 yr [interquartile range, 30-51 yr]; 735 [58.8%] with Crohn's disease, 483 [38.7%] with ulcerative colitis, and 31 [2.5%] with indeterminate colitis). A total of 366 patients (29.3%) presented with EIMs. Of those, 63.4% presented with 1, 26.5% with 2, 4.9% with 3, 2.5% with 4, and 2.7% with 5 EIMs during their lifetime. Patients presented with the following diseases as first EIMs: peripheral arthritis 70.0%, aphthous stomatitis 21.6%, axial arthropathy/ankylosing spondylitis 16.4%, uveitis 13.7%, erythema nodosum 12.6%, primary sclerosing cholangitis 6.6%, pyoderma gangrenosum 4.9%, and psoriasis 2.7%. In 25.8% of cases, patients presented with their first EIM before IBD was diagnosed (median: 5 mo before IBD diagnosis; range, 0-25 mo), and in 74.2% of cases, the first EIM manifested itself after IBD diagnosis (median: 92 mo; range, 29-183 mo). CONCLUSIONS In one quarter of patients with IBD, EIMs appeared before the time of IBD diagnosis. Occurrence of EIMs should prompt physicians to look for potential underlying IBD.
Abstract:
PURPOSE In the present case series, the authors report on seven cases of erosively worn dentitions (98 posterior teeth) that were treated with direct resin composite. MATERIALS AND METHODS In all cases, both arches were restored using the so-called stamp technique. All patients were treated with standardized materials and protocols. Prior to treatment, a waxup was made on die-cast models to compensate for the loss of occlusion and to ensure the optimal future anatomy and function of the eroded teeth to be restored. During treatment, teeth were restored using silicone templates (ie, two "stamps," one on the vestibular and one on the oral aspect of each tooth), which were filled with resin composite in order to transfer the planned future restoration (ie, in the shape of the waxup) from the extraoral to the intraoral situation. Baseline examinations were performed in all patients after treatment, and photographs as well as radiographs were taken. To evaluate the outcome, the modified United States Public Health Service (USPHS) criteria were used. RESULTS The patients were re-assessed after a mean observation time of 40 months (40.8 ± 7.2 months). The overall outcome of the restorations was good, and almost exclusively "Alpha" scores were given. Only marginal integrity and anatomical form received a "Charlie" score (10.2%), in two cases. CONCLUSION Direct resin composite restorations made with the stamp technique are a valuable treatment option for restoring erosively worn dentitions.
Abstract:
The mental speed approach explains individual differences in intelligence by faster information processing in individuals with higher intelligence compared to those with lower intelligence, especially in elementary cognitive tasks (ECTs). One of the most examined ECTs is the Hick paradigm. The present study aimed to contrast reaction time (RT) and P3 latency in a Hick task as predictors of intelligence. Although both RT and P3 latency are commonly used as indicators of mental speed, they are known to measure different aspects of information processing. Participants were 113 female students. RT and P3 latency were measured while participants completed the Hick task with four levels of complexity. Intelligence was assessed with Cattell's Culture Fair Test. An RT factor and a P3 factor were extracted by a PCA across complexity levels. There was no significant correlation between the factors. Commonality analysis was used to determine the proportions of unique and shared variance in intelligence explained by the RT and P3 latency factors. RT and P3 latency explained 5.5% and 5% of unique variance in intelligence, respectively. However, the two speed factors did not explain a significant portion of shared variance. This result suggests that RT and P3 latency in the Hick paradigm measure different aspects of information processing that explain different parts of the variance in intelligence.
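For readers unfamiliar with commonality analysis, the two-predictor decomposition reported above follows directly from the R² values of three regressions. A minimal sketch on simulated data (the study itself used PCA factor scores, not these simulated variables):

```python
# Two-predictor commonality analysis: unique and shared variance in the
# criterion follow from the R^2 of the full and single-predictor models.
# All data below are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 113
rt, p3 = rng.standard_normal(n), rng.standard_normal(n)  # uncorrelated speed factors
iq = -0.23 * rt - 0.22 * p3 + rng.standard_normal(n)     # illustrative criterion

def r2(*predictors):
    X = np.column_stack(predictors)
    return LinearRegression().fit(X, iq).score(X, iq)

r2_full, r2_rt, r2_p3 = r2(rt, p3), r2(rt), r2(p3)
unique_rt = r2_full - r2_p3           # variance only RT explains
unique_p3 = r2_full - r2_rt           # variance only P3 latency explains
common    = r2_rt + r2_p3 - r2_full   # shared variance (near zero here)
print(unique_rt, unique_p3, common)
```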
Abstract:
In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called trace gas extractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated methane (750 ppm, parts per million, µmol mol⁻¹) is 0.1 and 0.5 ‰ for δ13C-CH4 and δD-CH4, respectively, at 10 min averaging time. Based on repeated measurements of compressed air during a 2-week intercomparison campaign, the repeatability of the TREX–QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C-CH4 and δD-CH4, respectively. In this intercomparison campaign the new in situ technique was compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling, and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data for the isotopic composition. After correcting for scale offsets, the average differences between TREX–QCLAS data and bag/flask sampling–IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C-CH4 and δD-CH4, respectively. This also demonstrates the potential to improve interlaboratory compatibility based on the analysis of a reference air sample with accurately determined isotopic composition.
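The reported δ values use standard delta notation: the per-mil deviation of a sample's isotope ratio from a reference standard (VPDB for δ13C, VSMOW for δD). A minimal sketch, with an illustrative sample ratio rather than measured data:

```python
# Delta notation: delta = (R_sample / R_standard - 1) * 1000, in per mil.
# The sample ratio below is illustrative, not a measurement from the study.
R_VPDB = 0.0111802  # commonly cited 13C/12C ratio of the VPDB standard

def delta_permil(r_sample, r_standard):
    """Per-mil deviation of a sample isotope ratio from a standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

print(delta_permil(0.0106547, R_VPDB))  # ~ -47 permil, typical of ambient CH4
```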
Abstract:
Study abroad has been an established institution in US universities for almost a century, and hundreds of thousands of students travel to all corners of the world every year. While many list some degree of cultural immersion as a main goal, most students have a difficult time achieving it. Drawing from interviews with twenty-five UConn undergraduates who studied abroad, this study attempts to identify factors that hold students back from cultural encounters. The study also discusses the 'success stories' of undergraduates who made significant connections abroad, and highlights the factors that can lead to this (e.g., homestays, jobs, internships).
Abstract:
Analysis of recurrent events has been widely discussed in medical, health services, insurance, and engineering areas in recent years. This research proposes a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function for recurrent events data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One advantage of using a nonhomogeneous Yule process for recurrent events is that it assumes the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to estimate the parameters of the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. An example comparing recurrent myocardial infarction events between two distinct populations, Mexican-Americans and non-Hispanic whites, in the Corpus Christi Heart Project is examined.
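As an illustration only (the dissertation's exact specification may differ), one simple reading of the Yule assumption is an intensity of (N(t−) + 1) · λ_0(t) · exp(x′β), where N(t−) is the number of events so far; such a process can be simulated by Ogata thinning:

```python
# Simulate a recurrent-event process whose rate grows with the event count,
# via Ogata thinning. The (n + 1) scaling is an assumed illustrative form of
# the Yule property, not necessarily the dissertation's exact model.
import math
import random

def simulate(t_max, x_beta, lam0=lambda t: 0.1, lam0_max=0.1, seed=1):
    rng = random.Random(seed)
    base = math.exp(x_beta)                  # exp(x'beta): covariate effect
    t, n, events = 0.0, 0, []
    while t < t_max:
        lam_bar = (n + 1) * lam0_max * base  # dominating (upper-bound) rate
        t += rng.expovariate(lam_bar)        # candidate inter-event time
        if t >= t_max:
            break
        lam_t = (n + 1) * lam0(t) * base     # true intensity at candidate t
        if rng.random() < lam_t / lam_bar:   # accept w.p. lam_t / lam_bar
            events.append(t)
            n += 1
    return events

print(simulate(t_max=10.0, x_beta=0.5))
```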
Abstract:
We examine the time-series relationship between housing prices in eight Southern California metropolitan statistical areas (MSAs). First, we perform cointegration tests of the housing price indexes for the MSAs, finding seven cointegrating vectors. Thus, the evidence suggests that one common trend links the housing prices in these eight MSAs, a purchasing power parity finding for housing prices in Southern California. Second, we perform temporal Granger causality tests revealing intertwined temporal relationships. The Santa Ana MSA leads the pack in temporally causing housing prices in six of the other seven MSAs, excluding only the San Luis Obispo MSA. The Oxnard MSA experienced the largest number of temporal effects from other MSAs, six of the seven, excluding only Los Angeles. The Santa Barbara MSA proved the most isolated in that it temporally caused housing prices in only two other MSAs (Los Angeles and Oxnard), and housing prices in the Santa Ana MSA temporally caused prices in Santa Barbara. Third, we calculate out-of-sample forecasts in each MSA, using various vector autoregressive (VAR) and vector error-correction (VEC) models, as well as Bayesian, spatial, and causality versions of these models with various priors. Different specifications provide superior forecasts in the different MSAs. Finally, we consider the ability of these time-series models to provide accurate out-of-sample predictions of the turning points in housing prices that occurred in 2006:Q4. Recursive forecasts, where the sample is updated each quarter, provide reasonably good forecasts of turning points.
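The first two steps correspond to standard routines in statsmodels; a minimal sketch on simulated cointegrated series (two series stand in for the eight MSA indexes, whose actual data are not given here):

```python
# Johansen cointegration and Granger causality on simulated data that share
# one common stochastic trend, standing in for the eight MSA price indexes.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
trend = np.cumsum(rng.standard_normal(200))       # common stochastic trend
prices = pd.DataFrame({
    "SantaAna": trend + rng.standard_normal(200),
    "Oxnard":   trend + rng.standard_normal(200),
})

# Johansen test: the number of trace statistics exceeding the 5% critical
# value estimates the number of cointegrating vectors (n-1 vectors among n
# series implies a single common trend, as found for the eight MSAs).
res = coint_johansen(prices, det_order=0, k_ar_diff=1)
print("cointegrating vectors:", int((res.lr1 > res.cvt[:, 1]).sum()))

# Granger causality: does the second column temporally lead the first?
grangercausalitytests(prices[["Oxnard", "SantaAna"]], maxlag=4)
```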
Abstract:
In this paper we introduce technical efficiency via an intercept that evolves over time as an AR(1) process within a panel data stochastic frontier (SF) framework. The distinguishing features of the model are as follows. First, the model is dynamic in nature. Second, it can separate technical inefficiency from fixed firm-specific effects that are not part of inefficiency. Third, the model allows one to estimate technical change separately from change in technical efficiency. We propose the maximum likelihood (ML) method to estimate the parameters of the model. Finally, we derive expressions to calculate or predict technical inefficiency (efficiency).
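One way to write such a model, offered as an illustration rather than the authors' exact specification, is:

```latex
% An illustrative dynamic SF specification (not necessarily the authors'):
\begin{align*}
  y_{it} &= \alpha_{it} + x_{it}'\beta + \mu_i + v_{it}, \\
  \alpha_{it} &= \rho\,\alpha_{i,t-1} + \varepsilon_{it}, \qquad |\rho| < 1,
\end{align*}
% where \mu_i is a fixed firm effect (not part of inefficiency) and v_{it}
% is noise. Measuring inefficiency relative to the best firm in each period,
%   u_{it} = \max_j \alpha_{jt} - \alpha_{it},
% separates efficiency change (movement in \alpha_{it}) from technical
% change (movement of the frontier \max_j \alpha_{jt}).
```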
Abstract:
The purpose of this research was twofold: to investigate the effect of institutionalization on death and CD4 decline in a cohort of 325 HIV-infected Romanian children, and to investigate the effect of disclosure of the child's own HIV status in this cohort. All children were treated with Kaletra-based highly active antiretroviral therapy and were followed from November 2001 through October 2004. The mean age of the children included in the cohort was 13 years. The study found that children in biological families were more likely to experience disease progression, through either death or CD4 decline, than children in institutions (p=0.04). The family home-style institution may prove to be a replicable model for the safe and appropriate care of HIV-infected orphaned and abandoned children and teens. The study also found that children who did not know their own HIV infection status were more likely to experience disease progression, through either death or CD4 decline, than children who knew their HIV diagnosis (p=0.03). This evidence suggests that, in the context of highly active antiretroviral therapy, knowledge of one's own HIV infection status is associated with delayed HIV disease progression.
Abstract:
This descriptive systematic review describes intervention trials for children and youth that targeted screen time (ST) as a way to prevent or control obesity and that measured ST and at least one of the following: physical activity, dietary intake, and adiposity. Both "hands-on" (e.g., video games) and "hands-free" (e.g., television viewing) ST were included. Published, completed intervention trials (k=12), not-yet-published, completed trials (k=6), and in-progress trials (k=11) were identified through searches of electronic databases, including trial registries and the bibliographies of eligible study reports. Study characteristics of the 29 identified trials were coded and presented in evidence tables. Considerable attention was paid to the type of ST addressed, the measures used, and the type of interventions. Based on the number of in-progress and not-yet-published trials, the number of completed, published reports will double in the next three years. Most of the studies were funded by federal sources. General populations, not restricted by race, gender, or weight status, were the targets of most interventions, with children ages 9-12 years as the modal age group. Most studies were randomized controlled trials in which the majority of control or comparison groups received an intervention. The mean number of participants was 242.8 (SD=314.7), and interventions were delivered over an average of 10.5 months and consisted of approximately 16 sessions, with a total time of about eight hours. The majority of completed trials evaluated each of the four constructs; however, most studies used more than one measure to assess each construct (e.g., BMI and triceps skinfold thickness to evaluate adiposity), and studies rarely used the same measures. This is likely why the majority of studies produced at least one significant intervention effect on each outcome that was assessed. The four major outcomes should be evaluated in all interventions attempting to reduce screen time in order to determine the mechanisms involved that may contribute to obesity. More importantly, researchers should work together to determine the best measures for evaluating the four main constructs, so that studies can be compared. Another area for consensus is the definition of ST.
Abstract:
Geographic health planning analyses, such as service area calculations, are hampered by a lack of patient-specific geographic data. Using the limited patient address information in patient management systems, planners analyze patient origin based on home address. But activity space research, done sparingly in public health and extensively in non-health-related arenas, uses multiple addresses per person when analyzing accessibility. Health care access research has also shown that there are many non-geographic factors that influence choice of provider. Most planning methods, however, overlook non-geographic factors influencing choice of provider, and the limited data mean the analyses can only be related to home address. This research attempted to determine to what extent geography plays a part in patient choice of provider and whether activity space data can be used to calculate service areas for primary care providers. During Spring 2008, a convenience sample of 384 patients of a locally funded Community Health Center in Houston, Texas, completed a survey that asked which factors are important when selecting a health care provider. A subset of this group (336) also completed an activity space log that captured location and time data on the places where the patient regularly goes. Survey results indicate that, for this patient population, geography plays a role in choice of health care provider, but it is not the most important reason for choosing one. Other factors, such as the provider offering "free or low cost visits," meeting "all of the patient's health care needs," and seeing "the patient quickly," were all ranked higher than geographic reasons. Analysis of the patient activity locations shows that activity spaces can be used to create service areas for a single primary care provider. Weighted activity-space-based service areas have the potential to include more patients in the service area, since more than one location per patient is used. Further analysis of the logs shows that a reduced set of locations, selected by time and type, could be used for this methodology, facilitating ongoing data collection for activity-space-based planning efforts.
Abstract:
In 1979, China implemented the one-child policy to curb the burden that massive demographic growth would place on future economic development and quality of living conditions. At the time, a quarter of the world's population resided in China while occupying only 7 percent of the world's arable land (The World Factbook, 2006). The government set a target total population of about 1.4 billion for the year 2010 and aimed to significantly reduce the natural rate of increase. First, this overview paper describes the population demographics and economy of Chinese society. The paper then investigates what the one-child policy entails and how it is implemented. Furthermore, the consequences of the policy with regard to population growth, sex ratio, marital discrepancies, adverse health of mother and child, the aging population, and pension coverage are examined. Finally, future recommendations and an alternative policy are proposed to increase the effectiveness of the policy and improve its effects on health.