927 results for seed retention time
Abstract:
By means of fixed-links modeling, the present study identified different processes of visual short-term memory (VSTM) functioning and investigated how these processes are related to intelligence. We conducted an experiment in which participants were presented with a color change detection task. Task complexity was manipulated by varying the number of presented stimuli (set size). We collected hit rate and reaction time (RT) as indicators of the amount of information retained in VSTM and the speed of VSTM scanning, respectively. Due to the impurity of these measures, however, the variability in hit rate and RT was assumed to consist not only of genuine variance due to individual differences in VSTM retention and VSTM scanning but also of other, non-experimental portions of variance. Therefore, we identified two qualitatively different types of components for both hit rate and RT: (1) non-experimental components representing processes that remained constant irrespective of set size and (2) experimental components reflecting processes that increased as a function of set size. For RT, intelligence was negatively associated with the non-experimental components but was unrelated to the experimental components assumed to represent variability in VSTM scanning speed. This finding indicates that individual differences in basic processing speed, rather than in speed of VSTM scanning, differentiate between individuals of higher and lower intelligence. For hit rate, the experimental component constituting individual differences in VSTM retention was positively related to intelligence. The non-experimental components of hit rate, representing variability in basal processes, however, were not associated with intelligence. By decomposing VSTM functioning into non-experimental and experimental components, significant associations with intelligence were revealed that otherwise might have been obscured.
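The study used fixed-links (latent variable) models; as a rough, simplified analogue of the decomposition it describes, the sketch below splits each simulated participant's RT into a set-size-independent intercept (non-experimental component) and a set-size-dependent slope (experimental component) via per-person regression, then correlates both with a stand-in intelligence score. All parameters, and the simulated direction of the association, are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, set_sizes = 200, np.array([2.0, 4.0, 6.0, 8.0])

# Simulated RTs: person-specific baseline (set-size independent) plus a
# person-specific scanning slope times set size, plus noise (all values assumed).
baseline = rng.normal(450, 60, n_subjects)            # ms
scan_slope = rng.normal(45, 12, n_subjects)           # ms per item
rt = (baseline[:, None] + scan_slope[:, None] * set_sizes
      + rng.normal(0, 25, (n_subjects, set_sizes.size)))

# Assumed intelligence score, tied (negatively) to baseline speed only,
# mirroring the direction of the association reported in the abstract.
iq = 100 - 0.08 * (baseline - 450) + rng.normal(0, 8, n_subjects)

# Per-person regression of RT on set size: the intercept plays the role of the
# "non-experimental" component, the slope that of the "experimental" one.
X = np.column_stack([np.ones_like(set_sizes), set_sizes])
intercepts, slopes = np.linalg.lstsq(X, rt.T, rcond=None)[0]

print("r(intercept, IQ) =", round(np.corrcoef(intercepts, iq)[0, 1], 2))  # negative
print("r(slope, IQ)     =", round(np.corrcoef(slopes, iq)[0, 1], 2))      # ~0
```

In the study itself these components were estimated jointly as latent variables within a structural equation model rather than by separate per-person regressions.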
Abstract:
Claystones are considered worldwide as barrier materials for nuclear waste repositories. In the Mont Terri underground research laboratory (URL), a nearly 4-year diffusion and retention (DR) experiment has been performed in Opalinus Clay. It aimed at (1) obtaining data at larger space and time scales than in laboratory experiments, (2) doing so under relevant in situ conditions with respect to pore water chemistry and mechanical stress, (3) quantifying the anisotropy of in situ diffusion, and (4) exploring possible effects of a borehole-disturbed zone. The experiment included two tracer injection intervals in a borehole perpendicular to bedding, through which traced artificial pore water (APW) was circulated, and a pressure monitoring interval. The APW was spiked with neutral tracers (HTO, HDO, H2O-18), anions (Br, I, SeO4), and cations (Na-22, Ba-133, Sr-85, Cs-137, Co-60, Eu-152, stable Cs, and stable Eu). Most tracers were added at the beginning; some were added at a later stage. The hydraulic pressure in the injection intervals was adjusted according to the measured value in the pressure monitoring interval to ensure transport by diffusion only. Concentration time-series in the APW within the borehole intervals were obtained, as well as 2D concentration distributions in the rock at the end of the experiment after overcoring and subsampling, which resulted in ~250 samples and ~1300 analyses. As expected, HTO diffused the furthest into the rock, followed by the anions (Br, I, SeO4) and by the cationic sorbing tracers (Na-22, Ba-133, Cs, Cs-137, Co-60, Eu-152). The diffusion of SeO4 was slower than that of Br or I, approximately proportional to the ratio of their diffusion coefficients in water. Ba-133 diffused only ~0.1 m during the ~4 years. Stable Cs, added at a higher concentration than Cs-137, diffused further into the rock than Cs-137, consistent with non-linear sorption behavior. The rock properties (e.g., water contents) were rather homogeneous at the centimeter scale, with no evidence of a borehole-disturbed zone. In situ anisotropy ratios for diffusion, derived for the first time directly from field data, are larger for HTO and Na-22 (~5) than for anions (~3–4 for Br and I). The lower ionic strength of the pore water at this location (~0.22 M) as compared to locations of earlier experiments in the Mont Terri URL (~0.39 M) had no notable effect on the anion-accessible pore fraction for Cl, Br, and I: the value of 0.55 is within the range of earlier data. Detailed transport simulations involving different codes will be presented in a companion paper.
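For context, the governing equation typically used to interpret such in situ diffusion-and-retention tests in claystones (background notation, not reproduced from the paper) couples Fickian diffusion with linear sorption and a bedding-parallel/bedding-normal anisotropy:

```latex
\[
  \underbrace{\bigl(\epsilon + \rho_b K_d\bigr)}_{\text{rock capacity factor}}
  \frac{\partial C}{\partial t}
  = \nabla \cdot \bigl( \mathbf{D}_e \,\nabla C \bigr),
  \qquad
  \mathbf{D}_e = \operatorname{diag}\!\bigl(D_{e,\parallel},\, D_{e,\parallel},\, D_{e,\perp}\bigr),
  \qquad
  A = \frac{D_{e,\parallel}}{D_{e,\perp}} .
\]
```

Here C is the tracer concentration in pore water, ε the tracer-accessible porosity, ρ_b the dry bulk density, K_d the sorption distribution coefficient (zero for HTO and the anions), D_e the effective diffusion tensor, and A the anisotropy ratio reported above (~5 for HTO and Na-22, ~3–4 for Br and I).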
Abstract:
In several studies of antiretroviral treatment (ART) programs for persons with human immunodeficiency virus infection, investigators have reported a higher rate of loss to follow-up (LTFU) among patients initiating ART in recent years than among patients who initiated ART in earlier time periods. This finding is frequently interpreted as reflecting deterioration of patient retention in the face of increasing patient loads. However, in this paper we demonstrate by simulation that transient gaps in follow-up can lead to bias when standard survival analysis techniques are applied. We created a simulated cohort of patients with different dates of ART initiation. Rates of ART interruption, ART resumption, and mortality were assumed to remain constant over time, but when we applied a standard definition of LTFU, the simulated probability of being classified LTFU at a particular ART duration was substantially higher in recently enrolled cohorts. This suggests that much of the apparent trend towards increased LTFU may be attributed to bias caused by transient interruptions in care. Alternative statistical techniques need to be used when analyzing predictors of LTFU, for example "prospective" definitions of LTFU in place of "retrospective" definitions. Similar considerations may apply when analyzing predictors of LTFU from treatment programs for other chronic diseases.
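A minimal simulation in the spirit of the one described above can reproduce the bias: interruption and resumption rates are held constant across enrolment cohorts, yet a "retrospective" LTFU definition (no visit within a fixed window before database closure) classifies more recently enrolled patients as lost at the same ART duration. All rates, window lengths, and dates below are assumed for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (assumed) parameters; the paper does not report these exact values.
INTERRUPT_RATE = 0.5   # care interruptions per person-year while in care
RESUME_RATE = 2.0      # returns to care per person-year while interrupted
VISIT_INTERVAL = 0.25  # routine visit every 3 months while in care
CLOSURE = 10.0         # database closure, in years after the earliest enrolment
GAP = 0.75             # "retrospective" LTFU: no visit in the 9 months before closure
HORIZON = 1.0          # evaluate apparent LTFU at 1 year of ART

def simulate_visits(enrol):
    """ART-duration times of all clinic visits before database closure."""
    visits, t, in_care = [0.0], 0.0, True
    next_change = rng.exponential(1.0 / INTERRUPT_RATE)
    while enrol + t < CLOSURE:
        if in_care:
            nxt_visit = visits[-1] + VISIT_INTERVAL
            if nxt_visit < next_change and enrol + nxt_visit < CLOSURE:
                visits.append(nxt_visit)
                t = nxt_visit
            else:                                    # interruption starts
                t, in_care = next_change, False
                next_change = t + rng.exponential(1.0 / RESUME_RATE)
        else:                                        # return to care
            t = next_change
            if enrol + t >= CLOSURE:
                break
            visits.append(t)
            in_care = True
            next_change = t + rng.exponential(1.0 / INTERRUPT_RATE)
    return visits

def apparent_one_year_ltfu(enrol, n=5000):
    """Share classified LTFU whose 'last visit' falls within the first year of ART."""
    events = 0
    for _ in range(n):
        last = simulate_visits(enrol)[-1]
        if (CLOSURE - enrol) - last > GAP and last <= HORIZON:
            events += 1
    return events / n

for enrol in (0.0, 5.0, 8.5):        # early, middle, and recent enrolment (years)
    print(f"enrolled at year {enrol}: apparent 1-year LTFU = "
          f"{apparent_one_year_ltfu(enrol):.3f}")
```

With identical behaviour in every cohort, the early and middle cohorts show essentially no apparent first-year LTFU (their transient gaps are later seen to resolve), while the recent cohort does, illustrating the bias the paper describes.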
Abstract:
AIM Several surveys have evaluated retention approaches among orthodontists, but none exist for general dentists. The primary aim of this survey was to record the preferred fixed retainer designs and retention protocols amongst general dentists and orthodontists in Switzerland. A secondary aim was to investigate whether retention patterns were associated with parameters such as gender, university of graduation, time in practice, and specialist status. METHODS An anonymized questionnaire was distributed to general dentists (n = 401) and orthodontists (n = 398) practicing in the German-speaking part of Switzerland. A total of 768 questionnaires could be delivered, of which 562 (73.2 %) were returned and evaluated. Descriptive statistics were performed, and responses to questions of interest were converted to binary outcomes and analyzed using multiple logistic regression. Associations between the answers and gender, university of graduation (Swiss or foreign), years in practice, and specialist status (orthodontist/general dentist) were assessed. RESULTS Almost all responding orthodontists (98.0 %) and nearly a third of general dentists (29.6 %) reported bonding fixed retainers regularly. The answers were not associated with the practitioner's gender. The university of graduation and number of years in practice had a moderate impact on the responses. The answers were most strongly influenced by specialist status. CONCLUSION University of graduation, years in practice, and specialist status influence retention protocols, and evidence-based guidelines for fixed retention should be issued to minimize these effects. Based on the observation that bonding and maintenance of retainers are also performed by general dentists, these guidelines should be taught in dental school and not during post-graduate training.
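A sketch of the kind of multiple logistic regression described in the methods, assuming hypothetical file and column names for the survey data (the abstract does not specify the coding):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and column names; one row per responding practitioner.
df = pd.read_csv("retention_survey.csv")

# Binary outcome: regularly bonds fixed retainers (1 = yes, 0 = no), modelled against
# gender, Swiss vs. foreign university of graduation, years in practice, and specialist status.
model = smf.logit(
    "bonds_retainers ~ C(gender) + C(swiss_graduate) + years_in_practice + C(orthodontist)",
    data=df,
).fit()

print(model.summary())
print(np.exp(model.params))   # coefficients expressed as odds ratios
```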
Abstract:
Seed production, seed dispersal, and seedling recruitment are integral to forest dynamics, especially in masting species. Often these are studied separately, yet scarcely ever for species with ballistic dispersal, even though this mode of dispersal is common in legume trees of tropical African rain forests. Here, we studied two dominant main-canopy tree species, Microberlinia bisulcata and Tetraberlinia bifoliolata (Caesalpinioideae), in 25 ha of primary rain forest at Korup, Cameroon, during two successive masting events (2007/2010). In the vicinity of c. 100 and 130 trees of each species, 476/580 traps caught dispersed seeds, and beneath their crowns c. 57,000 pod valves per species were inspected to estimate tree-level fecundity. Seed production of trees increased non-linearly and asymptotically with increasing stem diameter. It was unequal within the two species’ populations and differed strongly between years, fostering both spatial and temporal patchiness in seed rain. The M. bisulcata trees could begin seeding at 42–44 cm diameter, a much larger size than T. bifoliolata (25 cm). Nevertheless, per capita life-time reproductive capacity was c. five times greater in M. bisulcata than in T. bifoliolata owing to the former’s larger adult stature, lower mortality rate (despite a shorter life-time) and smaller seed mass. The two species displayed strong differences in their dispersal capabilities. Inverse modelling (IM) revealed that dispersal of M. bisulcata was best described by a lognormal kernel. Most seeds landed at 10–15 m from stems, with 1% of them going beyond 80 m (<100 m). The direct estimates of fecundity significantly improved the fitted models. The lognormal kernel also described well the seedling recruitment distribution of this species in 121 ground plots. By contrast, the lower intensity of masting and more limited dispersal of the heavier-seeded T. bifoliolata prevented reliable IM. For this species, seed density as a function of distance to traps suggested a maximum dispersal distance of 40–50 m, and a correspondingly more aggregated seedling recruitment pattern ensued than for M. bisulcata. From this integrated field study, we conclude that the reproductive traits of M. bisulcata give it a considerable advantage over T. bifoliolata by better dispersing more seeds per capita to reach more suitable establishment sites, and combined with other key traits they explain its local dominance in the forest. Understanding the linkages between size at onset of maturity, individual fecundity, and dispersal capability can better inform the life-history strategies, and hence management, of co-occurring tree species in tropical forests.
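A minimal sketch of inverse modelling with a 2D lognormal dispersal kernel, in the spirit of the analysis described above: expected seed counts per trap are summed over trees, with a simple diameter-based fecundity proxy, and kernel parameters are fitted by maximizing a Poisson likelihood. The data, the fecundity form, and the starting values are placeholders, not the study's.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Placeholder inputs; the study's trap counts, tree map, and fecundity estimates are not reproduced.
trap_xy = rng.uniform(0, 500, size=(100, 2))       # trap coordinates (m)
counts = rng.poisson(2, size=100)                  # seeds caught per trap
tree_xy = rng.uniform(0, 500, size=(60, 2))        # adult tree coordinates (m)
dbh = rng.uniform(40, 120, size=60)                # stem diameters (cm)

def lognormal_kernel(r, mu, sigma):
    """2D log-normal dispersal kernel: seed density per m^2 at distance r from a source."""
    r = np.maximum(r, 1e-6)
    return np.exp(-(np.log(r) - mu) ** 2 / (2 * sigma ** 2)) / (
        (2 * np.pi) ** 1.5 * sigma * r ** 2
    )

def neg_loglik(params, trap_area=0.5):
    mu, sigma, beta = params
    if sigma <= 0 or beta <= 0:
        return np.inf
    fecundity = beta * dbh ** 2                    # crude size-fecundity proxy (assumed)
    dist = np.linalg.norm(trap_xy[:, None, :] - tree_xy[None, :, :], axis=2)
    expected = trap_area * (fecundity[None, :] * lognormal_kernel(dist, mu, sigma)).sum(axis=1)
    return -poisson.logpmf(counts, np.maximum(expected, 1e-9)).sum()

fit = minimize(neg_loglik, x0=[2.5, 0.8, 0.01], method="Nelder-Mead")
print(fit.x)   # fitted (mu, sigma) of the kernel and the fecundity coefficient
```

In the study the fecundity term was informed by the direct pod-valve counts rather than estimated purely from the trap data, which is what made the fitted models markedly better.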
Abstract:
OBJECTIVE To evaluate treatment response of hepatocellular carcinoma (HCC) after transarterial chemoembolization (TACE) with a new real-time image fusion technique of contrast-enhanced ultrasound (CEUS) with multi-slice computed tomography (CT), in comparison to conventional post-interventional follow-up. MATERIAL AND METHODS 40 patients with HCC (26 male, ages 46-81 years) were evaluated 24 hours after TACE using CEUS with ultrasound volume navigation and image fusion with CT, compared to non-enhanced CT and follow-up contrast-enhanced CT after 6-8 weeks. Reduction of tumor vascularization to less than 25% was regarded as "successful" treatment, whereas residual vascularization above 25% was considered a "partial" treatment response. Homogeneous lipiodol retention was regarded as successful treatment on non-enhanced CT. RESULTS Post-interventional image fusion of CEUS with CT was feasible in all 40 patients. In 24 patients (24/40), post-interventional image fusion with CEUS revealed residual tumor vascularity, which was confirmed by contrast-enhanced CT 6-8 weeks later in all 24 patients (24/24). In 16 patients (16/40), post-interventional image fusion with CEUS demonstrated successful treatment, but follow-up CT detected residual viable tumor in 6 of these 16 patients. Non-enhanced CT did not identify any case of treatment failure. Image fusion with CEUS assessed treatment efficacy with a specificity of 100%, a sensitivity of 80%, a positive predictive value of 1.0, and a negative predictive value of 0.63. CONCLUSIONS Image fusion of CEUS with CT allows a reliable, highly specific post-interventional evaluation of embolization response with good sensitivity and without any further radiation exposure. It can detect residual viable tumor at an early stage, allowing close patient monitoring or re-treatment.
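The reported accuracy figures follow directly from the 2x2 table implied by the counts in the abstract (24 true positives, 6 false negatives, 10 true negatives, 0 false positives):

```python
# 2x2 table implied by the reported counts (n = 40 patients)
tp, fn = 24, 6     # residual tumour present: detected / missed by CEUS-CT fusion
tn, fp = 10, 0     # no residual tumour: correctly negative / falsely positive

sensitivity = tp / (tp + fn)    # 24/30 = 0.80
specificity = tn / (tn + fp)    # 10/10 = 1.00
ppv = tp / (tp + fp)            # 24/24 = 1.00
npv = tn / (tn + fn)            # 10/16 = 0.625, reported as 0.63

print(sensitivity, specificity, ppv, npv)
```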
Abstract:
OBJECTIVES We studied the influence of noninjecting and injecting drug use on mortality, dropout rate, and the course of antiretroviral therapy (ART) in the Swiss HIV Cohort Study (SHCS). METHODS Cohort participants, registered prior to April 2007 and with at least one drug use questionnaire completed by May 2013, were categorized according to their self-reported drug use behaviour. The probabilities of death and dropout were analysed separately using multivariable competing risks proportional hazards regression models with mutual correction for the other endpoint. Furthermore, we describe the influence of drug use on the course of ART. RESULTS A total of 6529 participants (including 31% women) were followed during 31 215 person-years; 5.1% of participants died and 10.5% were lost to follow-up. Among persons with homosexual or heterosexual HIV transmission, noninjecting drug use was associated with higher all-cause mortality [subhazard ratio (SHR) 1.73; 95% confidence interval (CI) 1.07-2.83] compared with no drug use. Mortality was also increased among former injecting drug users (IDUs) who reported noninjecting drug use (SHR 2.34; 95% CI 1.49-3.69). Noninjecting drug use was associated with higher dropout rates. The mean proportion of time with suppressed viral replication was 82.2% in all participants, irrespective of ART status, and 91.2% in those on ART. Drug use lowered adherence and increased rates of ART change and ART interruption. Virological failure on ART was more frequent in participants who reported concomitant drug injection while on opiate substitution, and in current IDUs, but not among noninjecting drug users. CONCLUSIONS Noninjecting drug use and injecting drug use are modifiable risk factors for death, and they lower retention in a cohort and complicate ART.
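For reference, "competing risks proportional hazards regression" usually denotes the Fine-Gray subdistribution-hazard model, whose exponentiated coefficients are the subhazard ratios quoted above; this is background notation, not an equation taken from the paper:

```latex
\[
  \lambda_k(t \mid \mathbf{x}) = \lambda_{k,0}(t)\,\exp\!\bigl(\boldsymbol{\beta}_k^{\top}\mathbf{x}\bigr),
  \qquad
  \mathrm{SHR}_j = \exp\!\bigl(\beta_{k,j}\bigr).
\]
```

Here λ_k is the subdistribution hazard for event type k (e.g., death, with dropout as the competing event), x collects the covariates such as drug-use category, and SHR_j is the subhazard ratio associated with covariate j.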
Abstract:
OBJECTIVE In 2013, Mozambique adopted Option B+, universal lifelong antiretroviral therapy (ART) for all pregnant and lactating women, as the national strategy for prevention of mother-to-child transmission of HIV. We analyzed retention in care of pregnant and lactating women starting Option B+ in rural northern Mozambique. METHODS We compared ART outcomes in pregnant ("B+pregnant"), lactating ("B+lactating") and non-pregnant, non-lactating women of childbearing age starting ART on the basis of clinical and/or immunological criteria ("own health") between July 2013 and June 2014. Loss to follow-up was defined as no contact for >180 days after the last visit. Multivariable competing risk models were adjusted for type of facility (type 1 vs. peripheral type 2 health center), age, WHO stage, and time from HIV diagnosis to ART. RESULTS Over 333 person-years of follow-up (of 243 "B+pregnant", 65 "B+lactating" and 317 "own health" women), 3.7% of women died and 48.5% were lost to follow-up. "B+pregnant" and "B+lactating" women were more likely than "own health" women to be lost in the first year (57% vs. 56.9% vs. 31.6%; p<0.001) and to have no follow-up after the first visit (42.4% vs. 29.2% vs. 16.4%; p<0.001). In adjusted analyses, the risk of being lost to follow-up was higher in "B+pregnant" (adjusted subhazard ratio [asHR]: 2.77; 95% CI: 2.18-3.50; p<0.001) and "B+lactating" women (asHR: 1.94; 95% CI: 1.37-2.74; p<0.001). Type 2 health center was the only additional significant risk factor for loss to follow-up. CONCLUSIONS Retention of pregnant and lactating women on Option B+ ART was poor; losses to follow-up occurred mainly early. The success of Option B+ for prevention of mother-to-child transmission of HIV in rural settings with weak health systems will depend on specific improvements in counseling and retention measures, especially at the beginning of treatment.
Abstract:
Research examining programs designed to retain patients in health care focuses on repeated interactions between outreach workers and patients (Bradford et al. 2007; Cheever 2007). The purpose of this study was to determine whether patients who were peer-mentored at their intake exam remained in care longer and attended more physician visits than those who were not mentored. Using patients' medical records and a previously created mentor database, the study determined how many patients attended their intake visit but subsequently failed to establish regular care. The cohort study examined risk factors for establishing care, determined whether patients lacking a peer mentor failed to establish care more often than mentor-assisted patients, and assessed whether peer-mentored patients had better health outcomes. The sample consisted of 1639 patients entered into the Thomas Street Patient Mentor Database between May 2005 and June 2007. Assignment to the mentored group was not randomized but was based on mentor availability. The data from the Mentor Database were then analyzed descriptively using statistical software (SPSS version 15; SPSS Inc., Chicago, Illinois, USA). Results indicated that patients who had a mentor at intake were more likely to return for primary care HIV visits at 90 and 180 days. Mentored patients were also more likely to be prescribed ART within 180 days of intake. Other risk factors that affected remaining in care included gender, previous care status, time from diagnosis to intake visit, and intravenous drug use. Clinical health outcomes did not differ significantly between groups; nevertheless, the retention and ART-initiation findings suggest that mentoring improved engagement in care. Continuing to use peer-mentoring programs for HIV care may help to increase retention of patients in care and improve patients' health in a cost-effective manner. Future research on the effects of peer mentoring on mentors, and on the effects of concordance between mentor and patient demographics, may help to further improve peer-mentoring programs.
Abstract:
In prospective studies it is essential that the study sample accurately represents the target population for meaningful inferences to be drawn. Understanding why some individuals do not participate, or fail to continue to participate, in longitudinal studies can provide an empirical basis for the development of effective recruitment and retention strategies to improve response rates. This study examined the influence of social connectedness and self-esteem on long-term retention of participants, using secondary data from the “San Antonio Longitudinal Study of Aging” (SALSA), a population-based study of Mexican Americans (MAs) and European Americans (EAs) aged over 65 years residing in San Antonio, Texas. We tested the effect of social connectedness, self-esteem and socioeconomic status on participant retention in both ethnic groups. In MAs only, we analyzed whether acculturation and assimilation moderated these associations and/or had a direct effect on participant retention. Low income, low frequency of social contacts and length of recruitment interval were significant predictors of non-completer status. Participants with low levels of social contacts were almost twice as likely as those with high levels of social contacts to be non-completers, even after adjustment for age, sex, ethnic group, education, household income, and recruitment interval (OR = 1.95, 95% CI: 1.26–3.01, p = 0.003). Recruitment interval consistently and strongly predicted non-completer status in all the models tested. Depending on the model, for each year beyond baseline there was a 25–33% greater likelihood of non-completion. The only significant interaction, or moderating, effect observed was between social contacts and cultural values among MAs. Specifically, MAs with both low social contacts and low acculturation on cultural values (i.e., who placed high value on preserving Mexican cultural origins) were three and a half times more likely to be non-completers compared with MAs in the other subgroups defined by the combination of these variables, even after adjustment for covariates. Long-term studies with older and minority participants are challenging for participant retention. Strategies can be designed to enhance retention by paying special attention to participants with low social contacts and, in MAs, participants with both low social contacts and low acculturation on cultural values. Minimizing the time interval between baseline and follow-up recruitment, and maintaining frequent contact with participants during this interval, should also be integral to the study design.
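One plausible formalization of the moderation analysis described above, with assumed variable coding (not the study's):

```latex
\[
  \operatorname{logit}\Pr(\text{non-completer})
  = \beta_0
  + \beta_1\,\mathrm{LowContacts}
  + \beta_2\,\mathrm{LowAcculturation}
  + \beta_3\,(\mathrm{LowContacts}\times\mathrm{LowAcculturation})
  + \boldsymbol{\gamma}^{\top}\mathbf{z},
\]
```

where z collects the adjustment covariates (age, sex, ethnic group, education, household income, recruitment interval); contrasting the joint low-contacts/low-acculturation subgroup against the remaining subgroups is the kind of comparison behind the reported roughly 3.5-fold difference in the likelihood of non-completion.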
Abstract:
Loss to follow-up (LTFU) in the care and treatment of HIV/AIDS represents a particularly problematic aspect when evaluating the success of treatment programs. Identifying modifiable factors that lead to LTFU would be important if we are to design effective retention interventions. The purpose of this study was to identify the challenges faced by children seeking care and treatment at a large HIV clinic in Botswana. In order to identify those factors, we used mixed methods drawing on different sources of information available at the Baylor Clinic. The first method involved a case-control study through which we interviewed a select representation of children aged 1-18 years who, at some point in time, had attended clinic at the Baylor Clinic in Gaborone, Botswana. We document this in detail in the first journal article. We defined LTFU as patients who had not attended clinic for more than 6 months at the onset of the study; the comparison group was recruited from among those who had attended clinic at any point in the 6 months leading up to the start of the study. Factors were compared between the cases and controls. The second methodology involved conducting in-depth interviews with health providers to elicit their opinions and experiences dealing with patients at the Baylor Clinic in general and the LTFU patients in particular. We document this methodology and its findings in the second journal article. We found that most patients who were LTFU had failed to engage with the clinic. Nearly half of the LTFU patients (47.66%) made only one visit to the clinic, compared with less than 1% of the control group (P<0.01, two-tailed Fisher's exact test). Among the interviewed patients, psychosocial factors such as stigma, religious beliefs, child rebellion and concerns about disclosure of HIV status were characteristic of the LTFU population, but psychosocial issues were not cited by the comparison group. We also found that these psychosocial aspects of the patients point toward a broader mental health problem that needs to be addressed. Socioeconomic factors such as lack of transport, school-related activities and forgetting check-up dates were cited more often by controls than by cases. From these findings, there is a need to target interventions towards engaging pediatric patients at their initial clinic visit. Such interventions would focus on psychosocial support, as well as involving faith-based organizations in planning joint responses.
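A sketch of the reported case-control comparison using Fisher's exact test; the cell counts are hypothetical values chosen only to match the reported percentages, since the abstract does not give the raw 2x2 table:

```python
from scipy.stats import fisher_exact

# Hypothetical cell counts matching the reported percentages
# (47.66% of LTFU cases vs. <1% of controls made a single clinic visit).
table = [[51, 56],    # cases (LTFU): one visit / more than one visit  (51/107 = 47.66%)
         [1, 106]]    # controls:     one visit / more than one visit  (<1%)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(odds_ratio, p_value)    # p << 0.01, consistent with the reported comparison
```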
Abstract:
Globalization has resulted in unprecedented movements of people, goods, and alien species across the planet. Although the impacts of biological invasions are widely appreciated, research effort is biased toward post-dispersal processes because of the difficulties of measuring propagule pressure. The Antarctic provides an ideal model system in which to investigate propagule movements because of the region's isolation and small number of entry routes. Here we investigated the logistics operations of the South African National Antarctic Programme (SANAP) and quantified the initial dispersal of alien species into the region. We found that over 1400 seeds from 99 taxa are transported into the Antarctic each field season in association with SANAP passenger luggage and cargo. The first-ever assessment of propagule drop-off indicated that 30-50% of these propagules will enter the recipient environment. Many of the taxa include cosmopolitan weeds and known aliens in the Antarctic, indicating that logistics operations form part of a globally self-perpetuating cycle moving alien species between areas of human disturbance. In addition, propagules of some taxa native to the Antarctic region were also found, suggesting that human movements may be facilitating intra-regional homogenization. Several relatively simple changes in biosecurity policy that could significantly reduce the threat of introduction of non-native species are suggested.
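A back-of-envelope reading of the reported figures, combining the seed count with the estimated drop-off fraction:

```latex
\[
  0.30 \times 1400 \approx 420
  \quad\text{to}\quad
  0.50 \times 1400 \approx 700
  \ \text{propagules entering the recipient environment per field season (at minimum).}
\]
```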
Abstract:
This data set contains a time series of plant height measurements (vegetative and reproductive) from the main experiment plots of a large grassland biodiversity experiment (the Jena Experiment; see further details below). In addition, data on species-specific plant heights for the main experiment are available for 2002. In the main experiment, 82 grassland plots of 20 x 20 m were established from a pool of 60 species belonging to four functional groups (grasses, legumes, tall herbs, and small herbs). In May 2002, varying numbers of plant species from this species pool were sown into the plots to create a gradient of plant species richness (1, 2, 4, 8, 16, and 60 species) and functional richness (1, 2, 3, and 4 functional groups). Plots were maintained by twice-yearly weeding and mowing.
1. Plant height was generally recorded twice a year, just before biomass harvest (during peak standing biomass in late May and in late August). Methodologies for measuring height have varied somewhat over the years: in earlier years the stretched plant height was measured, while in later years the standing height was measured without stretching the plant. Vegetative height was measured either as the height of the highest leaf or as the length of the main axis of non-flowering plants. Regenerative height was measured either as the height of the highest flower on a plant or as the height of the main axis of flowering plants. Sampled plants were either randomly selected in the core area of the plots or along transects at defined distances. For details refer to the description of individual years. Starting in 2006, the plots of the management experiment, which altered mowing frequency and fertilization of subplots (see further details in the general description of the Jena Experiment), were also sampled.
2. Species-specific plant height was recorded twice in 2002: in late July (vegetative height) and just before biomass harvest during peak standing biomass in late August (vegetative and regenerative height). For each plot and each sown species in the species pool, three plant individuals (if present) from the central area of the plot were randomly selected and used to measure vegetative height (non-flowering individuals) and regenerative height (flowering individuals) as stretched height. Provided are the means over the three measurements per plant species per plot.
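A minimal sketch of how the species-level means described above could be computed from a flat table of individual measurements; the file and column names are assumptions, not the data set's actual field names:

```python
import pandas as pd

# Hypothetical file and column names; the published data set may use different labels.
df = pd.read_csv("jena_species_heights_2002.csv")
# expected columns: plot, species, individual, vegetative_height_cm, regenerative_height_cm

species_means = (
    df.groupby(["plot", "species"])[["vegetative_height_cm", "regenerative_height_cm"]]
      .mean()                     # mean over the (up to) three measured individuals
      .reset_index()
)
print(species_means.head())
```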