952 results for Time Period


Relevance:

60.00%

Abstract:

OBJECTIVES This study sought to report the final 5-year follow-up of the landmark LEADERS (Limus Eluted From A Durable Versus ERodable Stent Coating) trial. BACKGROUND The LEADERS trial is the first randomized study to evaluate biodegradable polymer-based drug-eluting stents (DES) against durable polymer DES. METHODS The LEADERS trial was a 10-center, assessor-blind, noninferiority, "all-comers" trial (N = 1,707). All patients were centrally randomized to treatment with either biodegradable polymer biolimus-eluting stents (BES) (n = 857) or durable polymer sirolimus-eluting stents (SES) (n = 850). The primary endpoint was a composite of cardiac death, myocardial infarction (MI), or clinically indicated target vessel revascularization within 9 months. Secondary endpoints included extending the primary endpoint to 5 years and stent thrombosis (ST) (Academic Research Consortium definition). Analysis was by intention to treat. RESULTS At 5 years, the BES was noninferior to the SES for the primary endpoint (186 [22.3%] vs. 216 [26.1%], rate ratio [RR]: 0.83 [95% confidence interval (CI): 0.68 to 1.02], p for noninferiority <0.0001, p for superiority = 0.069). The BES was associated with a significant reduction in the more comprehensive patient-orientated composite endpoint of all-cause death, any MI, and all-cause revascularization (297 [35.1%] vs. 339 [40.4%], RR: 0.84 [95% CI: 0.71 to 0.98], p for superiority = 0.023). A significant reduction in very late definite ST from 1 to 5 years was evident with the BES (n = 5 [0.7%] vs. n = 19 [2.5%], RR: 0.26 [95% CI: 0.10 to 0.68], p = 0.003), corresponding to a significant reduction in ST-associated clinical events (primary endpoint) over the same time period (n = 3 of 749 vs. n = 14 of 738, RR: 0.20 [95% CI: 0.06 to 0.71], p = 0.005). CONCLUSIONS The safety benefit of the biodegradable polymer BES, compared with the durable polymer SES, was related to a significant reduction in very late ST (>1 year) and associated composite clinical outcomes. (Limus Eluted From A Durable Versus ERodable Stent Coating [LEADERS] trial; NCT00389220).
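
As a rough illustration of how a rate ratio and its confidence interval relate to event counts like those above, the sketch below applies a standard log-scale Wald approximation to simple cumulative proportions. The trial itself used time-to-event methods on the intention-to-treat population, so this simplified calculation is only indicative and will not exactly reproduce the published figures.

```python
import math

def rate_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Rate ratio of group A vs. group B with a Wald-type 95% CI
    on the log scale (simple cumulative-proportion approximation)."""
    p_a = events_a / n_a
    p_b = events_b / n_b
    rr = p_a / p_b
    # Standard error of log(RR) for two independent proportions
    se_log_rr = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, (lo, hi)

# Illustrative use with the 5-year primary-endpoint counts (BES vs. SES);
# the published RR of 0.83 came from a time-to-event analysis, so this
# simple approximation will not match it exactly.
rr, ci = rate_ratio_ci(186, 857, 216, 850)
print(f"RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```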

Relevance:

60.00%

Abstract:

The response of atmospheric chemistry and dynamics to volcanic eruptions and to a decrease in solar activity during the Dalton Minimum is investigated with the fully coupled atmosphere–ocean chemistry general circulation model SOCOL-MPIOM (modeling tools for studies of SOlar Climate Ozone Links-Max Planck Institute Ocean Model) covering the time period 1780 to 1840 AD. We carried out several sensitivity ensemble experiments to separate the effects of (i) reduced solar ultraviolet (UV) irradiance, (ii) reduced solar visible and near-infrared irradiance, (iii) enhanced galactic cosmic ray intensity together with less intense solar energetic proton events and auroral electron precipitation, and (iv) volcanic aerosols. The imposed changes in UV irradiance and volcanic aerosols significantly influence stratospheric dynamics in the early 19th century, whereas changes in the visible part of the spectrum and in energetic particles have smaller effects. A 15% reduction of UV irradiance, the highest currently discussed estimate of the UV change caused by solar activity variations, produces a global ozone decrease below the stratopause reaching as much as 8% in the midlatitudes at 5 hPa, together with significant stratospheric cooling of up to 2 °C in the mid-stratosphere and up to 6 °C in the lower mesosphere. Changes in energetic particle precipitation lead to only minor changes in the yearly averaged temperature fields in the stratosphere. Volcanic aerosols heat the tropical lower stratosphere, allowing more water vapour to enter the tropical stratosphere, which, via HOx reactions, decreases upper stratospheric and mesospheric ozone by roughly 4%. Conversely, heterogeneous chemistry on aerosols reduces stratospheric NOx, leading to a 12% ozone increase in the tropics, whereas a decrease in ozone of up to 5% is found over Antarctica in boreal winter. The linear superposition of the different contributions is not equivalent to the response obtained in a simulation in which all forcing factors are applied together during the Dalton Minimum (DM); this effect is especially visible for NOx/NOy. Thus, this study also shows the non-linear behaviour of the coupled chemistry-climate system. Finally, we conclude that UV changes and volcanic eruptions in particular dominate the changes in ozone, temperature, and dynamics, while the NOx field is dominated by energetic particle precipitation. Changes in visible radiation have only very minor effects on both stratospheric dynamics and chemistry.
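
The non-additivity point amounts to a simple diagnostic: sum the anomalies from the single-forcing experiments and compare the result with the anomaly from the all-forcings run. The sketch below illustrates that comparison with made-up numbers; it is not the SOCOL-MPIOM analysis code, and the variable names and values are assumptions for illustration.

```python
import numpy as np

# Made-up anomaly values (one number per experiment for brevity);
# in practice these would be full latitude-height ensemble-mean fields.
anom_uv        = np.array([-0.12])   # UV-only forcing
anom_vis       = np.array([-0.01])   # visible/near-IR-only forcing
anom_particles = np.array([-0.30])   # energetic-particle-only forcing
anom_volcanic  = np.array([ 0.08])   # volcanic-aerosol-only forcing
anom_all       = np.array([-0.22])   # all forcings applied together

linear_sum   = anom_uv + anom_vis + anom_particles + anom_volcanic
nonlinearity = anom_all - linear_sum   # zero only if the responses add linearly

print("sum of single-forcing anomalies:", linear_sum)
print("all-forcings anomaly:           ", anom_all)
print("non-additive residual:          ", nonlinearity)
```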

Relevance:

60.00%

Abstract:

The time period covered by this bibliography is circa 1960 to 1974, an unusual "decade," to be sure. But because the Civil Rights Movement and Watergate framed this era, it became impossible to compile a bibliography on the 1960s without entering the 1970s. The Vietnam War did not end until 1975, which would seem a logical place to end the decade. However, Nixon's resignation sparked more public interest and energy than did the anti-climax of the end of the war. Hence the time period covered here. Although there are some key books on the Civil Rights Movement, which started in the late 1950s, that movement has been adequately covered in other resources.

Relevance:

60.00%

Abstract:

OBJECTIVE The aim of our investigation was to review the implementation of a comprehensive tobacco dependence education (TDE) curriculum at the Medi School of Dental Hygiene (MSDH), Bern, Switzerland, 2001-2008. METHODS In 2001, new forms to record patients' tobacco use history and willingness to quit were created for all MSDH patients. In 2002, a new, theoretically based tobacco dependence treatment protocol was implemented in the MSDH curriculum. Students received instruction on how to provide brief tobacco dependence interventions as well as how to maintain detailed records of patient tobacco use and cessation interventions for every smoker at all dental hygiene visits. RESULTS In 2002, 17 lecture hours were added to the following subjects: pathology, periodontology, preventive dentistry, pharmacology and psychology. During the same time period, 2213 patients (56.9% women) visited the MSDH. Smoking status was recorded for 85.7% of all patients (30.2% smokers). Brief tobacco use interventions were recorded for 36.8% of all smokers, and 7.6% of these reported having quit smoking. CONCLUSIONS Overall, the new TDE curriculum was successfully implemented and accepted by the MSDH faculty. Application in clinical practice, however, may still be improved to better identify smokers and to increase initial and follow-up interventions, potentially leading to higher quit rates.

Relevance:

60.00%

Abstract:

Few studies have addressed the interaction between instruction content and saccadic eye movement control. To assess the impact of instructions on top-down control, we instructed 20 healthy volunteers to deliberately delay saccade triggering, to make inaccurate saccades, or to redirect saccades, i.e., to glance towards the target and then immediately look in the opposite direction. Regular pro- and antisaccade tasks were used for comparison. Bottom-up visual input remained unchanged, with a gap paradigm used for all instructions. In the inaccuracy and delay tasks, both latency and accuracy were impaired by either type of instruction, and the variability of latency and accuracy increased. The intersaccadic interval (ISI) required to correct erroneous antisaccades was shorter than the ISI for instructed direction changes in the redirection task. The word-by-word content of the instruction thus interferes with top-down saccade control. Top-down control is a time-consuming process, which may override bottom-up processing only during a limited time period. It is questionable whether parallel processing is possible in top-down control, since the long ISI for instructed direction changes suggests sequential planning.

Relevance:

60.00%

Abstract:

Melanoma is the most common oral tumor in dogs, characterized by rapid growth, local invasion, and a high metastatic rate. The goal of this study was to evaluate the combination of radiation therapy and a DNA tumor vaccine. We hypothesized that the concurrent use would not increase toxicity. Nine dogs with oral melanoma were treated with 4 fractions of 8 Gray at 7-day intervals. The vaccine was given 4 times every 14 days, beginning at the first radiation fraction. Local acute radiation toxicities were assessed according to the VRTOG toxicity scoring scheme over a time period of 7 weeks. In none of the evaluated dogs did mucositis, dermatitis, or conjunctivitis exceed grade 2. In 3 dogs, mild fever, lethargy, and local swelling at the injection site were seen after vaccine application. In conclusion, the concurrent administration of radiation therapy and the vaccine was well tolerated in all dogs.

Relevance:

60.00%

Abstract:

Upper-air observations are a fundamental data source for global atmospheric data products, but uncertainties, particularly in the early years, are not well known. Most of the early observations, which have now been digitized, are prone to a large variety of undocumented uncertainties (errors) that need to be quantified, e.g., for their assimilation in reanalysis projects. We apply a novel approach to estimate errors in upper-air temperature, geopotential height, and wind observations from the Comprehensive Historical Upper-Air Network for the time period from 1923 to 1966. We distinguish between random errors, biases, and a term that quantifies the representativity of the observations. The method is based on a comparison of neighboring observations and is hence independent of metadata, making it applicable to a wide scope of observational data sets. The estimated mean random errors for all observations within the study period are 1.5 K for air temperature, 1.3 hPa for pressure, 3.0 m s−1 for wind speed, and 21.4° for wind direction. The estimates are compared to results of previous studies and analyzed with respect to their spatial and temporal variability.
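
A common way to estimate random observation errors without metadata is to compare neighbouring observations: if two nearby series share the same signal and carry independent random errors of similar size, the variance of their difference is roughly twice the individual error variance (plus a representativity term, ignored here). The sketch below demonstrates the idea on synthetic data; it is a simplified stand-in for the paper's method, and all values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" temperature anomalies shared by two neighbouring stations,
# plus independent random errors and a constant bias at station B.
true_signal = rng.normal(0.0, 3.0, size=5000)
sigma_err = 1.5                                    # K, the value we try to recover
obs_a = true_signal + rng.normal(0.0, sigma_err, size=5000)
obs_b = true_signal + rng.normal(0.0, sigma_err, size=5000) + 0.8  # +0.8 K bias

diff = obs_a - obs_b
bias_estimate = diff.mean()                        # systematic difference (bias)
# Var(diff) ~ 2 * sigma_err^2 when errors are independent and representativity
# differences are negligible; otherwise that term must be subtracted first.
random_error_estimate = np.sqrt(diff.var(ddof=1) / 2.0)

print(f"estimated bias:          {bias_estimate:.2f} K")
print(f"estimated random error:  {random_error_estimate:.2f} K")
```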

Relevance:

60.00%

Abstract:

The city of Bath is a World Heritage site, and its thermal waters, the Roman Baths, and the new spa development rely on the undisturbed flow of the springs (45 °C). The current investigations provide an improved understanding of the residence times and flow regime as a basis for source protection. Trace gas indicators including the noble gases (helium, neon, argon, krypton and xenon) and chlorofluorocarbons (CFCs), together with a more comprehensive examination of chemical and stable isotope tracers, are used to characterise the sources of the thermal water and any modern components. It is shown conclusively by the use of 39Ar that the bulk of the thermal water has been in circulation within the Carboniferous Limestone for at least 1000 years. Other stable isotope and noble gas measurements confirm previous findings and strongly suggest recharge within the Holocene time period (i.e. the last 12 kyr). Measurements of dissolved 85Kr and chlorofluorocarbons constrain previous indications from tritium that a small proportion (<5%) of the thermal water originates from modern leakage into the spring pipe where it passes through the Mesozoic valley fill underlying Bath. This introduces small amounts of O2 into the system, resulting in the Fe precipitation seen in the King’s Spring. Silica geothermometry indicates that the water is likely to have reached a maximum temperature of between 69 and 99 °C, indicating a most probable maximum circulation depth of ∼3 km, which is in line with recent geological models. The water's rise to the surface is sufficiently indirect that a temperature loss of >20 °C is incurred. There is overwhelming evidence that the water has evolved within the Carboniferous Limestone formation, although the chemistry alone cannot pinpoint the geometry of the recharge area or the circulation route. For a likely residence time of 1–12 kyr, volumetric calculations imply a large storage volume and circulation pathway if typical porosities of the limestone at depth are used, indicating that much of the Bath-Bristol basin must be involved in the water storage.
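
The depth estimate rests on simple arithmetic: a silica geothermometer converts the dissolved-silica concentration into a reservoir temperature, and dividing the temperature rise above the mean surface temperature by a geothermal gradient yields a circulation depth. The sketch below illustrates this with a Fournier-type quartz geothermometer and assumed inputs; the silica concentration, surface temperature, and gradient are placeholders, not the study's measured values.

```python
import math

def quartz_geothermometer_celsius(silica_mg_per_kg):
    """Fournier-type quartz geothermometer (no steam loss):
    T(degC) = 1309 / (5.19 - log10(SiO2 in mg/kg)) - 273.15."""
    return 1309.0 / (5.19 - math.log10(silica_mg_per_kg)) - 273.15

# Assumed illustrative inputs (not the measured Bath values)
silica = 45.0          # mg/kg dissolved SiO2
t_surface = 10.0       # degC, mean annual surface temperature
gradient = 30.0        # degC per km, assumed regional geothermal gradient

t_reservoir = quartz_geothermometer_celsius(silica)
depth_km = (t_reservoir - t_surface) / gradient

print(f"reservoir temperature ~ {t_reservoir:.0f} degC")
print(f"implied circulation depth ~ {depth_km:.1f} km")
```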

Relevance:

60.00%

Abstract:

The recovery of a 1.5 million year long ice core from Antarctica represents a keystone of our understanding of Quaternary climate, the progression of glaciation over this time period, and the role of greenhouse gas cycles in this progression. Here we tackle the question of where such ice may still be found in the Antarctic ice sheet. We show that such old ice is most likely to exist, without stratigraphic disturbance, in the plateau area of the East Antarctic ice sheet (EAIS), and that it should be recoverable after careful pre-site selection studies. Based on a simple ice and heat flow model and glaciological observations, we conclude that positions in the vicinity of major domes and saddle positions on the East Antarctic Plateau most likely have such old ice in store and represent the best study areas for dedicated reconnaissance studies in the near future. In contrast to previous ice core drill site selections, however, we strongly suggest a significantly reduced ice thickness to avoid bottom melting. For example, for the geothermal heat flux and accumulation conditions at Dome C, an ice thickness lower than but close to about 2500 m would be required to find 1.5 Myr old ice (i.e., more than 700 m less than at the current EPICA Dome C drill site). Within this constraint, the resolution of an Oldest-Ice record and the distance of such old ice to the bedrock should be maximized to avoid ice flow disturbances, for example by finding locations with minimum geothermal heat flux. As the geothermal heat flux is largely unknown for the EAIS, this parameter has to be carefully determined beforehand. In addition, detailed bedrock topography and ice flow history have to be reconstructed for candidate Oldest-Ice coring sites. Finally, we argue strongly for rapid access drilling before any full, deep ice coring activity commences, to bring datable samples to the surface and to allow an age check of the oldest ice.
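
Why thinner ice keeps the bed frozen can be illustrated with the classical Robin steady-state approximation, in which downward advection by surface accumulation limits how much geothermal warming reaches the base. The sketch below uses rounded, assumed values for heat flux, accumulation, diffusivity, and surface temperature; it is an illustration of the argument, not the ice and heat flow model used in the study.

```python
import math

# Robin (1955) steady-state approximation for the basal temperature of an
# ice sheet with downward advection from surface accumulation.
# All numbers below are rounded, assumed values for illustration only.
KAPPA = 35.0         # m^2 yr^-1, thermal diffusivity of ice (approx.)
K_ICE = 2.3          # W m^-1 K^-1, thermal conductivity of ice (approx.)
Q_GEO = 0.055        # W m^-2, assumed geothermal heat flux
ACCUM = 0.025        # m ice-equivalent yr^-1, assumed accumulation rate
T_SURF = -54.0       # degC, assumed plateau surface temperature

def basal_temperature(thickness_m):
    grad_base = Q_GEO / K_ICE                      # K m^-1, basal temperature gradient
    scale = math.sqrt(math.pi * KAPPA * thickness_m / (2.0 * ACCUM))
    damping = math.erf(math.sqrt(ACCUM * thickness_m / (2.0 * KAPPA)))
    return T_SURF + grad_base * scale * damping

def pressure_melting_point(thickness_m):
    return -7.4e-4 * thickness_m                   # degC, approximate depression

for h in (3200.0, 2500.0):
    tb, tm = basal_temperature(h), pressure_melting_point(h)
    state = "bed at/above melting" if tb >= tm else "bed frozen"
    print(f"H = {h:4.0f} m: basal T ~ {tb:5.1f} degC, "
          f"melting point ~ {tm:4.1f} degC -> {state}")
```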

Relevance:

60.00%

Abstract:

The Greenland NEEM (North Greenland Eemian Ice Drilling) operation in 2010 provided the first opportunity to combine trace-gas measurements by laser spectroscopic instruments and continuous-flow analysis along a freshly drilled ice core in a field-based setting. We present the resulting atmospheric methane (CH4) record covering the time period from 107.7 to 9.5 ka b2k (thousand years before 2000 AD). Companion discrete CH4 measurements are required to transfer the laser spectroscopic data from a relative to an absolute scale. However, even on a relative scale, the high-resolution CH4 data set significantly improves our knowledge of past atmospheric methane concentration changes. New significant sub-millennial-scale features appear during interstadials and stadials, generally associated with similar changes in the water isotopic ratios of the ice, a proxy for local temperature. In addition to the midpoints of Dansgaard–Oeschger (D/O) CH4 transitions usually used for cross-dating, the sharp definition of the start and end of these events provides precise depth markers (with ±20 cm uncertainty) for further cross-dating with other palaeo-records or ice core records, e.g. speleothems. The method also provides an estimate of CH4 rates of change. The onsets of D/O events in the methane signal show a more rapid rate of change than their endings. The rate of CH4 increase associated with the onsets of D/O events progressively declines from 1.7 to 0.6 ppbv yr−1 in the course of marine isotope stage 3. The largest observed rate of increase takes place at the onset of D/O event #21 and reaches 2.5 ppbv yr−1.
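
Rates of change such as the quoted ppbv per year values follow from differentiating the CH4 record against its gas-age scale. The sketch below shows that calculation on a small synthetic series; the ages, concentrations, and the transition itself are invented for illustration.

```python
import numpy as np

# Synthetic gas ages (years before 2000 AD) and CH4 (ppbv) samples around a
# hypothetical abrupt D/O-type transition; values are invented for illustration.
age_b2k = np.array([38400, 38350, 38300, 38250, 38200, 38150, 38100])
ch4_ppbv = np.array([460.0, 462.0, 470.0, 520.0, 575.0, 590.0, 592.0])

# Age decreases toward the present, so the rate of increase in ppbv per year is
# -dC/d(age); np.gradient handles uneven spacing when given the age axis.
rate = -np.gradient(ch4_ppbv, age_b2k)

peak = rate.max()
print(f"peak rate of CH4 increase ~ {peak:.1f} ppbv/yr at age {age_b2k[rate.argmax()]} b2k")
```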

Relevance:

60.00%

Abstract:

Context: Information currently available on the trafficking of minors in the U.S. for commercial sexual exploitation includes approximations of the numbers involved, risk factors that increase the likelihood of victimization, and methods of recruitment and control. However, specific characteristics about this vulnerable population remain largely unknown. Objective: This article has two distinct purposes. The first is to provide the reader with an overview of available information on minor sex trafficking in the U.S. The second is to present findings and discuss policy, research, and educational implications from a secondary data analysis of 115 cases of minor sex trafficking in the U.S. Design: Minor sex trafficking cases were identified through two main venues - a review of U.S. Department of Justice press releases of human trafficking cases and an online search of media reports. Searches covered the time period from October 28, 2000, which coincided with the passage of the Victims of Trafficking and Violence Protection Act (VTVPA), through October 31, 2009. Cases were included in the analysis if the incident involved at least one victim under the age of 18, occurred in the U.S., and at least one perpetrator had been arrested, indicted, or convicted. Results: A total of 115 separate incidents involving at least 153 victims were located. These occurrences involved 215 perpetrators, the majority of whom had been convicted (n = 117, 53.4%). The number of victims involved in a single incident ranged from 1 to 9. Over 90% of victims were female, ranging in age from 5 to 17 years. There were more U.S. minor victims than those from other countries. Victims had been in captivity from less than 6 months to 5 years. Minors most commonly fell into exploitation through some type of false promise (16.3%, n = 25), followed by kidnapping (9.8%, n = 15). Over a fifth of the sample (22.2%, n = 34) were abused through two commercial sex practices, with almost all victims (94.1%, n = 144) used in prostitution. About one in four victims (24.8%, n = 38) had been advertised on an Internet website. Conclusions: Results of a review of known information about minor sex trafficking and findings from the analysis of 115 incidents of the sex trafficking of youth in the U.S. indicate a need for stronger legislation to educate various professional groups, more comprehensive services for victims, stricter laws for pimps and traffickers, and preventive educational interventions beginning at a young age.

Relevance:

60.00%

Abstract:

Visual short-term memory (VSTM) is the storage of visual information over a brief time period (usually a few seconds or less). Over the past decade, the most popular task for studying VSTM in humans has been the change detection task. In this task, subjects must remember several visual items per trial in order to identify a change following a brief delay interval. Results from change detection tasks have shown that VSTM is limited; humans are able to hold only a few visual items accurately in mind over a brief delay. However, there has been much debate in regard to the structure or cause of these limitations. The two most popular conceptualizations of VSTM limitations in recent years have been the fixed-capacity model and the continuous-resource model. The fixed-capacity model proposes a discrete limit on the total number of visual items that can be stored in VSTM. The continuous-resource model proposes a continuous resource that can be allocated among many visual items in VSTM, with noise in item memory increasing as the number of items to be remembered increases. While VSTM is far from being completely understood in humans, even less is known about VSTM in non-human animals, including the rhesus monkey (Macaca mulatta). Given that rhesus monkeys are the premier medical model for humans, it is important to understand their VSTM if they are to contribute to understanding human memory. The primary goals of this study were to train and test rhesus monkeys and humans in change detection in order to directly compare VSTM between the two species and to explore the possibility that direct species comparison might shed light on the fixed-capacity vs. continuous-resource models of VSTM. The comparative results suggest qualitatively similar VSTM for the two species through converging evidence supporting the continuous-resource model, and thereby establish rhesus monkeys as a good system for exploring neurophysiological correlates of VSTM.
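
In the change detection literature, a common way to summarize performance is Cowan's K, a capacity estimate computed from hit and false-alarm rates; a roughly constant K across set sizes is usually read as support for a fixed capacity, while systematic departures fit better with a resource account. The sketch below computes K from made-up rates; it is a generic illustration, not the analysis used in this study.

```python
# Cowan's K, a common capacity estimate for single-probe change detection:
# K = set_size * (hit_rate - false_alarm_rate). A fixed-capacity account predicts
# K roughly constant once the set size exceeds the limit, whereas a
# continuous-resource account need not. Hit/false-alarm rates below are made up.
def cowans_k(set_size, hit_rate, false_alarm_rate):
    return set_size * (hit_rate - false_alarm_rate)

made_up_data = {
    # set_size: (hit_rate, false_alarm_rate)
    2: (0.98, 0.02),
    4: (0.80, 0.05),
    8: (0.45, 0.08),
}

for n, (h, fa) in made_up_data.items():
    print(f"set size {n}: K = {cowans_k(n, h, fa):.2f}")
```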

Relevance:

60.00%

Abstract:

The purpose of the study was to evaluate in vitro calcification potential among liposomes composed of phospholipids with variations in fatty acid chains and polar head groups. The liposomes were also modified by utilizing mixed phospholipids, incorporating different types of protein into the liposome, or complexing with various collagen preparations. The samples were then incubated in a metastable calcium phosphate solution for the proposed time period. Calcium and phosphate uptake were measured. Resulting precipitates were processed for x-ray diffraction and electron microscopy. Liposomes of the acidic phospholipid dioleoylphosphatidic acid, and of the mixed phospholipids dioleoylphosphatidic acid/dipalmitoylphosphatidylethanolamine, calcified at a faster rate and to a greater degree than the other phospholipids tested. The incorporation of polylysine, fibronectin, or bone protein, or complexing with collagen, decreased the rate and amount of calcification. Electron microscopy demonstrated the similarity of the calcified collagen-liposome complex to the natural calcification matrix. These preparations may be used as a model to study the role of membrane lipids and collagen-phospholipid complexes during the process of calcification.

The in vivo study was designed to determine whether the potential existed for the promotion of bone healing by the synthetic liposome-collagen complex. The implant materials were modified to provide decreased antigenicity and biocompatibility while maintaining their bone conduction properties. The samples were placed subcutaneously and/or subperiosteally and/or in 8 mm calvarium defects of adult rats. Histological and immunological studies demonstrated that the implant itself retained minimal antigenicity and did not inhibit bone formation. However, modification of the implant may incorporate bone induction properties and be utilized to stimulate bony healing.

Relevance:

60.00%

Abstract:

Gastrin-releasing peptide (GRP) and other bombesin-like peptides stimulate hormone secretion and cell proliferation by binding to specific G-protein-coupled receptors. Three studies were performed to identify potential mechanisms involved in GRP/bombesin receptor regulation.

Although bombesin receptors are localized throughout the gastrointestinal tract, few gastrointestinal cell lines are available to study bombesin action. In the first study, the binding and function of bombesin receptors in the human HuTu-80 duodenal cancer cell line were characterized. (125I-Tyr4) bombesin bound with high affinity to a GRP-preferring receptor. Bombesin treatment increased IP3 production but had no effect on cell proliferation. Similar processing of (125I-Tyr4) bombesin and of GRP-receptors was observed in HuTu-80 cells and Swiss 3T3 fibroblasts, a cell line that responds mitogenically to bombesin. Therefore, the lack of a bombesin mitogenic effect in HuTu-80 cells is not due to unusual processing of (125I-Tyr4) bombesin or to rapid GRP-receptor down-regulation.

In the second study, a bombesin antagonist was developed to study the processing and regulatory events after antagonist binding. As previously shown, the receptor-bound agonist, (125I-Tyr4) bombesin, was rapidly internalized and degraded in chloroquine-sensitive compartments. Interestingly, the receptor-bound antagonist, (125I-D-Tyr6) bombesin(6-13)PA, was not internalized but was degraded at the cell surface. In contrast to bombesin, (D-Tyr6) bombesin(6-13)PA treatment did not cause receptor internalization. Together these results demonstrate that receptor regulation and receptor-mediated processing of the antagonist differ from those of the agonist.

Bombesin receptors undergo acute desensitization. By analogy to other G-protein-coupled receptors, a potential desensitization mechanism may involve receptor phosphorylation. In the final study, 32P-labelled Swiss 3T3 fibroblasts and CHO-mBR1 cells were treated with bombesin and the GRP-receptor was immunoprecipitated. In both cell lines, bombesin treatment markedly stimulated GRP-receptor phosphorylation. Furthermore, bombesin-stimulated GRP-receptor phosphorylation occurred within the same time period as bombesin-stimulated desensitization, demonstrating that these two processes are correlated.

In conclusion, these studies of GRP-receptor regulation further our understanding of bombesin action and provide insight into G-protein-coupled receptor regulation in general.

Relevance:

60.00%

Abstract:

The objective of this study is to identify the relationship between population density and the initial stages of the spread of disease in a local population. The study concentrates on the question of how population density affects the distribution of susceptible individuals in a local population and thus affects the spread of the disease, measles. Population density is measured as the average number of contacts with susceptible individuals made by each individual in the population during a fixed-length time period. The term “contact with susceptible individuals” means sufficient contact between two people for the disease to pass from an infectious person to a susceptible person. The fixed-length time period is taken to be the average length of time an infected person is infectious without symptoms of the disease. For this study of measles, the time period will be seven days.

While much attention has been given to modeling the entire epidemic process of measles, attempts have not been made to study the characteristics of contact rates required to initiate an epidemic. This study explores the relationship between population density, given a specific herd immunity rate in the population, and the initial rate of spread of the disease by considering the underlying distribution of contacts with susceptibles by the individuals in the population.

This study does not seek to model an entire measles epidemic, but to model the above-stated relationship for the local population within which the first infective person is introduced. It describes the mathematical relationship between population density parameters and contact distribution parameters.

The results are displayed in graphs that show the effects of different population densities on the spread of disease, and they support the idea that the number of new infectives is strongly related to the distribution of susceptible contacts. The results also show large differences in the epidemic measures between populations with densities of four versus three.
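
The relationship described here can be illustrated with a small branching-process-style simulation: each infectious individual makes a Poisson-distributed number of contacts over the seven-day infectious period (mean equal to the density measure), and only contacts with susceptible individuals transmit. The sketch below uses those stated assumptions plus a hypothetical herd immunity rate; it is an illustration, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(42)

def initial_spread(density, immune_fraction, generations=3, trials=10_000):
    """Average number of new infectives per generation when contacts per
    infectious period are Poisson(density) and only contacts with
    susceptibles (fraction 1 - immune_fraction) transmit."""
    susceptible_fraction = 1.0 - immune_fraction
    totals = np.zeros(generations)
    for _ in range(trials):
        infectives = 1
        for g in range(generations):
            contacts = rng.poisson(density, size=infectives).sum()
            infectives = rng.binomial(contacts, susceptible_fraction)
            totals[g] += infectives
            if infectives == 0:
                break
    return totals / trials

# Hypothetical herd immunity rate of 75%; densities of three and four contacts
# per seven-day infectious period, matching the comparison in the abstract.
for density in (3, 4):
    means = initial_spread(density=density, immune_fraction=0.75)
    print(f"density {density}: mean new infectives per generation ~ {np.round(means, 2)}")
```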