29 results for CDMA CAPACITY ANALYSIS
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
PURPOSE: Two noninvasive methods to measure dental implant stability are damping capacity assessment (Periotest) and resonance frequency analysis (Osstell). The objective of the present study was to assess the correlation of these 2 techniques in clinical use. MATERIALS AND METHODS: Implant stability of 213 clinically stable loaded and unloaded 1-stage implants in 65 patients was measured in triplicate by means of resonance frequency analysis and Periotest. Descriptive statistics as well as Pearson's, Spearman's, and intraclass correlation coefficients were calculated with SPSS 11.0.2. RESULTS: The mean values were 57.66 +/- 8.19 implant stability quotient for the resonance frequency analysis and -5.08 +/- 2.02 for the Periotest. The correlation of both measuring techniques was -0.64 (Pearson) and -0.65 (Spearman). The single-measure intraclass correlation coefficients for the ISQ and Periotest values were 0.99 and 0.88, respectively (95% CI). No significant correlation of implant length with either resonance frequency analysis or Periotest could be found. However, a significant correlation of implant diameter with both techniques was found (P < .005). The correlation of both measuring systems is moderate to good. It seems that the Periotest is more susceptible to clinical measurement variables than the Osstell device. The intraclass correlation indicated lower measurement precision for the Periotest technique. Additionally, the Periotest values differed more from the normal (Gaussian) curve of distribution than the ISQs. Both measurement techniques show a significant correlation to the implant diameter. CONCLUSION: Resonance frequency analysis appeared to be the more precise technique.
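The correlation step described above (Pearson and Spearman coefficients for paired ISQ/Periotest readings) can be sketched with SciPy. The data below are synthetic, generated only to mimic the reported means and inverse relationship; they are not the study's measurements.

```python
# Sketch of the correlation analysis, on made-up paired stability readings.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic ISQ values around the reported mean (57.66 +/- 8.19)
isq = rng.normal(57.66, 8.19, 50)
# Hypothetical inversely related Periotest values with measurement noise
periotest = -0.16 * isq + 4.1 + rng.normal(0, 1.5, 50)

pearson_r, _ = stats.pearsonr(isq, periotest)
spearman_r, _ = stats.spearmanr(isq, periotest)
print(f"Pearson r = {pearson_r:.2f}, Spearman rho = {spearman_r:.2f}")
```

With real triplicate measurements, the intraclass correlation coefficients reported in the abstract would additionally quantify within-method repeatability.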
Abstract:
Excessive consumption of acidic drinks and foods contributes to tooth erosion. The aims of the present in vitro study were twofold: (1) to assess the erosive potential of different dietary substances and medications; (2) to determine the chemical properties with an impact on the erosive potential. We selected sixty agents: soft drinks, an energy drink, sports drinks, alcoholic drinks, juice, fruit, mineral water, yogurt, tea, coffee, salad dressing and medications. The erosive potential of the tested agents was quantified as the changes in surface hardness (ΔSH) of enamel specimens within the first 2 min (ΔSH2-0 = SH2 min - SHbaseline) and the second 2 min exposure (ΔSH4-2 = SH4 min - SH2 min). To characterise these agents, various chemical properties, e.g. pH, concentrations of Ca, Pi and F, titratable acidity to pH 7·0 and buffering capacity at the original pH value (β), as well as degree of saturation (pK - pI) with respect to hydroxyapatite (HAP) and fluorapatite (FAP), were determined. Erosive challenge caused a statistically significant reduction in SH for all agents except for coffee, some medications and alcoholic drinks, and non-flavoured mineral waters, teas and yogurts (P < 0·01). By multiple linear regression analysis, 52 % of the variation in ΔSH after 2 min and 61 % after 4 min immersion were explained by pH, β and concentrations of F and Ca (P < 0·05). pH was the variable with the highest impact in multiple regression and bivariate correlation analyses. Furthermore, a high bivariate correlation was also obtained between (pK - pI)HAP, (pK - pI)FAP and ΔSH.
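The erosion metric defined above is a simple difference of surface-hardness readings. A minimal illustration, with hypothetical hardness numbers (not study data):

```python
# Illustration of the erosion metric: change in surface hardness (ΔSH)
# over two successive 2 min exposures. Values are invented.
sh_baseline, sh_2min, sh_4min = 350.0, 310.0, 290.0

delta_sh_2_0 = sh_2min - sh_baseline   # ΔSH2-0: first 2 min of exposure
delta_sh_4_2 = sh_4min - sh_2min       # ΔSH4-2: second 2 min of exposure
print(delta_sh_2_0, delta_sh_4_2)      # -40.0 -20.0
```

More negative ΔSH indicates greater softening, i.e. higher erosive potential of the tested agent.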
Abstract:
Background Idiopathic pulmonary fibrosis is a progressive and fatal lung disease with inevitable loss of lung function. The CAPACITY programme (studies 004 and 006) was designed to confirm the results of a phase 2 study that suggested that pirfenidone, a novel antifibrotic and anti-inflammatory drug, reduces deterioration in lung function in patients with idiopathic pulmonary fibrosis. Methods In two concurrent trials (004 and 006), patients (aged 40–80 years) with idiopathic pulmonary fibrosis were randomly assigned to oral pirfenidone or placebo for a minimum of 72 weeks in 110 centres in Australia, Europe, and North America. In study 004, patients were assigned in a 2:1:2 ratio to pirfenidone 2403 mg/day, pirfenidone 1197 mg/day, or placebo; in study 006, patients were assigned in a 1:1 ratio to pirfenidone 2403 mg/day or placebo. The randomisation code (permuted block design) was computer generated and stratified by region. All study personnel were masked to treatment group assignment until after final database lock. Treatments were administered orally, 801 mg or 399 mg three times a day. The primary endpoint was change in percentage predicted forced vital capacity (FVC) at week 72. Analysis was by intention to treat. The studies are registered with ClinicalTrials.gov, numbers NCT00287729 and NCT00287716. Findings In study 004, 174 of 435 patients were assigned to pirfenidone 2403 mg/day, 87 to pirfenidone 1197 mg/day, and 174 to placebo. In study 006, 171 of 344 patients were assigned to pirfenidone 2403 mg/day, and 173 to placebo. All patients in both studies were analysed. In study 004, pirfenidone reduced decline in FVC (p=0·001). Mean FVC change at week 72 was −8·0% (SD 16·5) in the pirfenidone 2403 mg/day group and −12·4% (18·5) in the placebo group (difference 4·4%, 95% CI 0·7 to 9·1); 35 (20%) of 174 versus 60 (35%) of 174 patients, respectively, had a decline of at least 10%. 
A significant treatment effect was noted at all timepoints from week 24 and in an analysis over all study timepoints (p=0·0007). Mean change in percentage FVC in the pirfenidone 1197 mg/day group was intermediate to that in the pirfenidone 2403 mg/day and placebo groups. In study 006, the difference between groups in FVC change at week 72 was not significant (p=0·501). Mean change in FVC at week 72 was −9·0% (SD 19·6) in the pirfenidone group and −9·6% (19·1) in the placebo group, and the difference between groups in predicted FVC change at week 72 was not significant (0·6%, −3·5 to 4·7); however, a consistent pirfenidone effect was apparent until week 48 (p=0·005) and in an analysis of all study timepoints (p=0·007). Patients in the pirfenidone 2403 mg/day group had higher incidences of nausea (125 [36%] of 345 vs 60 [17%] of 347), dyspepsia (66 [19%] vs 26 [7%]), vomiting (47 [14%] vs 15 [4%]), anorexia (37 [11%] vs 13 [4%]), photosensitivity (42 [12%] vs 6 [2%]), rash (111 [32%] vs 40 [12%]), and dizziness (63 [18%] vs 35 [10%]) than did those in the placebo group. Fewer overall deaths (19 [6%] vs 29 [8%]) and fewer deaths related to idiopathic pulmonary fibrosis (12 [3%] vs 25 [7%]) occurred in the pirfenidone 2403 mg/day groups than in the placebo groups. Interpretation The data show pirfenidone has a favourable benefit risk profile and represents an appropriate treatment option for patients with idiopathic pulmonary fibrosis.
Abstract:
Over the past decades, major progress in patient selection, surgical techniques and anaesthetic management has largely contributed to improved outcome in lung cancer surgery. The purpose of this study was to identify predictors of post-operative cardiopulmonary morbidity in patients with a forced expiratory volume in 1 s <80% predicted, who underwent cardiopulmonary exercise testing (CPET). In this observational study, 210 consecutive patients with lung cancer underwent CPET with complete data over a 9-yr period (2001-2009). Cardiopulmonary complications occurred in 46 (22%) patients, including four (1.9%) deaths. On logistic regression analysis, peak oxygen uptake (peak V'(O₂)) and anaesthesia duration were independent risk factors of both cardiovascular and pulmonary complications; age and the extent of lung resection were additional predictors of cardiovascular complications, whereas tidal volume during one-lung ventilation was a predictor of pulmonary complications. Compared with patients with peak V'(O₂) >17 mL·kg⁻¹·min⁻¹, those with a peak V'(O₂) <10 mL·kg⁻¹·min⁻¹ had a four-fold higher incidence of cardiac and pulmonary morbidity. Our data support the use of pre-operative CPET and the application of an intra-operative protective ventilation strategy. Further studies should evaluate whether pre-operative physical training can improve post-operative outcome.
Abstract:
Long Term Evolution (LTE) is a cellular technology foreseen to extend the capacity and improve the performance of current 3G cellular networks. A key mechanism in LTE traffic handling is the packet scheduler, which is in charge of allocating resources to active flows in both the frequency and time dimensions. In this paper we present a performance comparison of three distinct scheduling schemes for the LTE uplink, with a main focus on the impact of flow-level dynamics resulting from random user behaviour. We apply a combined analytical/simulation approach which enables fast evaluation of flow-level performance measures. The results show that by considering flow-level dynamics we are able to observe performance trends that would otherwise stay hidden if only packet-level analysis is performed.
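The abstract does not detail the three schedulers, but the flow-level modelling idea it describes can be illustrated with a toy simulation: flows arrive at random, share a fixed uplink capacity equally (a processor-sharing abstraction), and the quantity of interest is the mean flow transfer time. All parameters below are invented for illustration; this is not the paper's model.

```python
# Toy flow-level simulation: Poisson flow arrivals, exponential flow sizes,
# equal (processor-sharing) division of a fixed uplink capacity.
import random

def mean_flow_time(arrival_rate=0.8, mean_size=1.0, capacity=1.0,
                   n_flows=5000, seed=42):
    random.seed(seed)
    t, arrivals = 0.0, []
    for _ in range(n_flows):
        t += random.expovariate(arrival_rate)
        arrivals.append((t, random.expovariate(1.0 / mean_size)))

    active, start, sojourn = {}, {}, []   # active: flow id -> remaining work
    now, i = 0.0, 0
    while i < n_flows or active:
        next_arr = arrivals[i][0] if i < n_flows else float("inf")
        if active:
            rate = capacity / len(active)      # equal sharing of capacity
            fid = min(active, key=active.get)  # flow that would finish next
            t_fin = now + active[fid] / rate
        else:
            t_fin = float("inf")
        if next_arr <= t_fin:                  # next event: an arrival
            if active:
                for f in active:
                    active[f] = max(active[f] - (next_arr - now) * rate, 0.0)
            now = next_arr
            active[i], start[i] = arrivals[i][1], now
            i += 1
        else:                                  # next event: a departure
            for f in active:
                active[f] = max(active[f] - (t_fin - now) * rate, 0.0)
            now = t_fin
            sojourn.append(now - start.pop(fid))
            del active[fid]
    return sum(sojourn) / len(sojourn)

# For these settings, M/M/1-PS theory predicts a mean flow time of about
# mean_size / (capacity - arrival_rate * mean_size) = 5.0
print(round(mean_flow_time(), 2))
```

Packet-level effects (scheduling granularity, channel quality per resource block) are deliberately abstracted away here, which is exactly the simplification that makes flow-level evaluation fast.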
Abstract:
This study evaluated the correlation between three strip-type, colorimetric tests and two laboratory methods with respect to the analysis of salivary buffering. The strip-type tests were Saliva-Check Buffer, Dentobuff Strip and the CRT(®) Buffer test. The laboratory methods included Ericsson's laboratory method and a monotone acid/base titration to create a reference scale for the salivary titratable acidity. Additionally, defined buffer solutions were prepared and tested to simulate the carbonate, phosphate and protein buffer systems of saliva. The correlation between the methods was analysed by Spearman's rank test. Disagreement was detected between the buffering capacity values obtained with the three strip-type tests, and it was more pronounced for saliva samples with medium and low buffering capacities. All strip-type tests were able to assign the hydrogencarbonate, di-hydrogenphosphate and 0.1% protein buffer solutions to the correct buffer categories. However, at 0.6% total protein concentration, none of the test systems worked accurately. Improvements to the strip-type tests are necessary because of their disagreement with Ericsson's laboratory method and their dependence on the protein content of saliva.
Abstract:
Dendritic cells (DC) represent a heterogeneous cell family of major importance for innate immune responses against pathogens and antigen presentation during infection, cancer, allergy and autoimmunity. The aim of the present study was to characterize canine DC generated in vitro with respect to their phenotype, responsiveness to toll-like receptor (TLR) ligands and T-cell stimulatory capacity. DC were derived from monocytes (MoDC) and from bone marrow hematopoietic cells cultured with either Flt3-ligand (FL-BMDC) or with GM-CSF (GM-BMDC). All three methods generated cells with typical DC morphology that expressed CD1c, CD11c and CD14, similar to macrophages. However, CD40 was only found on DC, CD206 on macrophages (MPhi) and BMDC, but not on monocytes and MoDC. CD1c was not found on monocytes but on all in vitro differentiated cells. FL-BMDC and GM-BMDC were partially positive for CD4 and CD8. CD45RA was expressed on a subset of FL-BMDC but not on MoDC and GM-BMDC. MoDC and FL-BMDC responded well to TLR ligands including poly-IC (TLR3), Pam3Cys (TLR2), LPS (TLR4) and imiquimod (TLR7) by up-regulating MHC II and CD86. The generated DC and MPhi showed a stimulatory capacity for lymphocytes, which increased upon maturation with LPS. Taken together, our results are the basis for further characterization of canine DC subsets with respect to their role in inflammation and immune responses.
Abstract:
OBJECTIVE: To identify markers associated with the chondrogenic capacity of expanded human articular chondrocytes and to use these markers for sorting of more highly chondrogenic subpopulations. METHODS: The chondrogenic capacity of chondrocyte populations derived from different donors (n = 21) or different clonal strains from the same cartilage biopsy specimen (n = 21) was defined based on the glycosaminoglycan (GAG) content of tissues generated using a pellet culture model. Selected cell populations were analyzed by microarray and flow cytometry. In some experiments, cells were sorted using antibodies against molecules found to be associated with differential chondrogenic capacity and again assessed in pellet cultures. RESULTS: Significance Analysis of Microarrays indicated that chondrocytes with low chondrogenic capacity expressed higher levels of insulin-like growth factor 1 and of catabolic genes (e.g., matrix metalloproteinase 2, aggrecanase 2), while chondrocytes with high chondrogenic capacity expressed higher levels of genes involved in cell-cell or cell-matrix interactions (e.g., CD49c, CD49f). Flow cytometry analysis showed that CD44, CD151, and CD49c were expressed at significantly higher levels in chondrocytes with higher chondrogenic capacity. Flow cytometry analysis of clonal chondrocyte strains indicated that CD44 and CD151 could also identify more chondrogenic clones. Chondrocytes sorted for brighter CD49c or CD44 signal expression produced tissues with higher levels of GAG per DNA (up to 1.4-fold) and type II collagen messenger RNA (up to 3.4-fold) than did unsorted cells. CONCLUSION: We identified markers that allow characterization of the capacity of monolayer-expanded chondrocytes to form in vitro cartilaginous tissue and enable enrichment for subpopulations with higher chondrogenic capacity. These markers might be used as a means to predict and possibly improve the outcome of cell-based cartilage repair techniques.
Abstract:
BACKGROUND: Outcome after lung transplantation (LTx) is affected by the onset of bronchiolitis obliterans syndrome (BOS) and lung function decline. Reduced health-related quality of life (HRQL) and physical mobility have been shown in patients developing BOS, but the impact on the capacity to walk is unknown. We aimed to compare the long-term HRQL and 6-minute walk test (6MWT) between lung recipients affected or not by BOS Grade > or =2. METHODS: Fifty-eight patients were prospectively followed for 5.6 +/- 2.9 years after LTx. Assessments included the St George's Respiratory Questionnaire (SGRQ) and the 6MWT, which were performed yearly. Moreover, clinical complications were recorded to estimate the proportion of the follow-up time lived without clinical complications after transplant. Analyses were performed using adjusted linear regression and repeated-measures analysis of variance. RESULTS: BOS was a significant predictor of lower SGRQ scores (p < 0.01) and reduced time free of clinical complications (p = 0.001), but not of 6MWT distance (p = 0.12). At 7 years post-transplant, results were: 69.0 +/- 21.8% vs 86.9 +/- 5.6%, p < 0.05 (SGRQ); 58.5 +/- 21.6% vs 88.7 +/- 11.4%, p < 0.01 (proportion of time lived without clinical complications); and 82.2 +/- 10.9% vs 91.9 +/- 14.2%, p = 0.27 (percent of predicted 6MWT), respectively, for patients with BOS and without BOS. CONCLUSIONS: Despite significantly less time lived without clinical complications and progressive decline of self-reported health status, the capacity to walk of patients affected by BOS remained relatively stable over time. These findings may indicate that the development of moderate to severe BOS does not prevent lung recipients from walking independently and pursuing an autonomous life.
Abstract:
PURPOSE: We report the clinical, morphological, and ultrastructural findings of 13 consecutively explanted opacified Hydroview(R) (hydrogel) intraocular lenses (IOLs). Our purpose was to provide a comprehensive account of the possible factors involved in late postoperative opacification of these IOLs. PATIENTS AND METHODS: Thirteen consecutive opacified hydrogel IOLs (Hydroview H 60 M, Bausch & Lomb) were explanted due to the significant visual impairment they caused. The IOLs underwent macroscopical examination, transmission electron microscopy (TEM), scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS), and electrophoresis for protein detection. Three unused control Hydroview IOLs served for comparison. RESULTS: Macroscopical examination showed a diffuse or localized grey-whitish opacification within the IOL optic. TEM confirmed the presence of lesions inside the optic in all the explanted IOLs and revealed 3 patterns of deep deposits: a) diffuse, thick, granular, electron-dense ones; b) small, thin, lattice-like ones, with prominent electron-lucent areas; and c) elongated electron-dense formations surrounded by electron-lucent halos. SEM showed surface deposits on four IOLs. EDS revealed oxygen and carbon in all IOLs and documented calcium, phosphorus, silicon and/or iron in the deposits. Two of the patients with iron in their IOLs had eye surgery prior to their phacoemulsification. Iron correlated well with the second TEM pattern of deep lesions, whereas calcium correlated with the third TEM pattern. No protein bands were detected on electrophoresis. Control lenses did not show any ultrastructural or chemical abnormality. CONCLUSIONS: The present study supports the presence of chemical alterations inside the polymer of the optic in late postoperative opacification of Hydroview IOLs. This opacification does not follow a unique pathway but may present under different ultrastructural patterns depending on the responsible factors.
Mechanical stress during surgery may initiate a sequence of events where ions such as calcium, phosphorus, silicon, and/or iron, participate in a biochemical cascade that leads to gradual alteration of the polymer network. Intraocular inflammation due to previous operation may be a factor inducing opacification through increase of iron-binding capacity in the aqueous humour. Calcification accounts only partially for the opacification noted in this type of IOL.
Abstract:
BACKGROUND: Peak oxygen uptake (peak Vo(2)) is an established integrative measurement of maximal exercise capacity in cardiovascular disease. After heart transplantation (HTx) peak Vo(2) remains reduced despite normal systolic left ventricular function, which highlights the relevance of diastolic function. In this study we aim to characterize the predictive significance of cardiac allograft diastolic function for peak Vo(2). METHODS: Peak Vo(2) was measured using a ramp protocol on a bicycle ergometer. Left ventricular (LV) diastolic function was assessed with tissue Doppler imaging measuring the velocity of the early (Ea) and late (Aa) apical movement of the mitral annulus, and conventional Doppler measuring early (E) and late (A) diastolic transmitral flow propagation. Correlation coefficients were calculated and linear regression models fitted. RESULTS: The post-transplant time interval of the 39 HTxs ranged from 0.4 to 20.1 years. The mean age of the recipients was 55 +/- 14 years and body mass index (BMI) was 25.4 +/- 3.9 kg/m(2). Mean LV ejection fraction was 62 +/- 4%, mean LV mass index 108 +/- 22 g/m(2) and mean peak Vo(2) 20.1 +/- 6.3 ml/kg/min. Peak Vo(2) was reduced in patients with more severe diastolic dysfunction (pseudonormal or restrictive transmitral inflow pattern), or when E/Ea was > or =10. Peak Vo(2) correlated with recipient age (r = -0.643, p < 0.001), peak heart rate (r = 0.616, p < 0.001) and BMI (r = -0.417, p = 0.008). Of all echocardiographic measurements, Ea (r = 0.561, p < 0.001) and Ea/Aa (r = 0.495, p = 0.002) correlated best. Multivariate analysis identified age, heart rate, BMI and Ea/Aa as independent predictors of peak Vo(2). CONCLUSIONS: Diastolic dysfunction is relevant for the limitation of maximal exercise capacity after HTx.
Abstract:
BACKGROUND AND OBJECTIVES: Complement inhibition is considered important in the mechanism of action of intravenous immunoglobulin (IVIG) in a number of inflammatory and autoimmune disorders. The capacity of different IVIG preparations to 'scavenge' activated C3 and thereby inhibit complement activation was assessed by a new in vitro assay. MATERIALS AND METHODS: Diluted human serum as a complement source, with or without addition of different concentrations of IVIG, was incubated in microtitre plates coated with heat-aggregated human IgG. Complement scavenging was measured by detecting reduced C3 binding and determining fluid phase C3b-IgG complex formation. Complement activation induced by the IVIG preparations was measured as C5a formation. RESULTS: All IVIG preparations exhibited a dose-dependent inhibition of C3b deposition, correlating strongly with binding of C3b to fluid-phase IgG, but the extent of complement scavenging varied considerably between different IVIG preparations. At an IVIG concentration of 0.9 mg/ml, the inhibition of C3b deposition ranged from 72 +/- 16% to 22 +/- 4.1%. The reduction of C3b deposition on the complement-activating surface was not due to IVIG-induced complement activation in the fluid phase, as shown by the low C5a formation in the presence of serum. CONCLUSION: In vitro analysis allows comparison of the complement-inhibitory properties of IVIG preparations. The extent of complement scavenging varies between the products.
Abstract:
Fragmentation and vegetative regeneration from small fragments may contribute to population expansion, dispersal and establishment of new populations of introduced plants. However, no study has systematically tested whether a high capacity of vegetative regeneration is associated with a high degree of invasiveness. For small single-node fragments, the presence of internodes may increase regeneration capacity because internodes may store carbohydrates and proteins that can be used for regeneration. We conducted an experiment with 39 stoloniferous plant species to examine the regeneration capacity of small, single-node fragments with or without attached stolon internodes. We asked (1) whether the presence of stolon internodes increases regeneration from single-node fragments, (2) whether regeneration capacity differs between native and introduced species in China, and (3) whether regeneration capacity is positively associated with plant invasiveness at a regional scale (within China) and at a global scale. Most species could regenerate from single-node fragments, and the presence of internodes increased regeneration rate and subsequent growth and/or asexual reproduction. Regeneration capacity varied greatly among species, but showed no relationship to invasiveness, either in China or globally. High regeneration capacity from small fragments may contribute to performance of clonal plants in general, but it does not appear to explain differences in invasiveness among stoloniferous clonal species.
Abstract:
In this paper, we show statistical analyses of several types of traffic sources in a 3G network, namely voice, video and data sources. For each traffic source type, measurements were collected in order to, on the one hand, gain better understanding of the statistical characteristics of the sources and, on the other hand, enable forecasting traffic behaviour in the network. The latter can be used to estimate service times and quality of service parameters. The probability density function, mean, variance, mean square deviation, skewness and kurtosis of the interarrival times are estimated by Wolfram Mathematica and Crystal Ball statistical tools. Based on evaluation of packet interarrival times, we show how the gamma distribution can be used in network simulations and in evaluation of available capacity in opportunistic systems. As a result, from our analyses, shape and scale parameters of gamma distribution are generated. Data can be applied also in dynamic network configuration in order to avoid potential network congestions or overflows.
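The gamma-fitting step described above can be sketched in a few lines with SciPy standing in for the Mathematica / Crystal Ball tools named in the abstract. The interarrival times below are synthetic, drawn from a gamma distribution with invented parameters, so the fit should recover values close to them.

```python
# Fit gamma shape and scale parameters to synthetic packet interarrival times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_shape, true_scale = 0.7, 2.0                    # invented "ground truth"
interarrivals = rng.gamma(true_shape, true_scale, 10000)

# Pin the location at 0, since interarrival times are non-negative.
shape, loc, scale = stats.gamma.fit(interarrivals, floc=0)
print(f"shape = {shape:.2f}, scale = {scale:.2f}")
```

The fitted shape/scale pair is exactly the kind of compact summary the abstract says can feed network simulations or capacity estimation.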
Abstract:
Building resilience to climate change in agricultural production can ensure the functioning of agricultural-based livelihoods and reduce their vulnerability to climate change impacts. This paper thus explores how buffer capacity, a characteristic feature of resilience, can be conceptualised and used for assessing the resilience of smallholder agriculture to climate change. It uses the case of conservation agriculture farmers in a Kenyan region and examines how their practices contribute to buffer capacity. Surveys were used to collect data from 41 purposely selected conservation agriculture farmers in the Laikipia region of Kenya. Besides descriptive statistics, factor analysis was used to identify the key dimensions that characterise buffer capacity in the study context. The cluster of practices characterising buffer capacity in conservation agriculture includes soil protection, adapted crops, intensification/irrigation, mechanisation and livelihood diversification. Various conservation practices increase buffer capacity, evaluated by farmers in economic, social, ecological and other dimensions. Through conservation agriculture, most farmers improved their productivity and incomes despite drought, as well as their environment and social relations. Better-off farmers also reduced their need for labour, but this resulted in fewer income-earning opportunities for the poorer farmers, thus reducing the buffer capacity and resilience of the latter.
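The factor-analysis step described above (extracting latent dimensions from survey items) can be sketched with scikit-learn. The data below are synthetic Likert-style scores generated from two hypothetical latent factors; they are not the Laikipia survey data, and the item count is invented.

```python
# Sketch of factor analysis on synthetic survey scores:
# 41 respondents, 6 items, generated from 2 hypothetical latent factors.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
n_farmers, n_items, n_factors = 41, 6, 2
latent = rng.normal(size=(n_farmers, n_factors))     # hidden dimensions
loadings = rng.normal(size=(n_factors, n_items))     # item loadings
scores = latent @ loadings + rng.normal(scale=0.3, size=(n_farmers, n_items))

fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(scores)
print(fa.components_.shape)   # (2, 6): loading of each item on each factor
```

In a study like the one above, items loading strongly on the same factor would be interpreted together as one buffer-capacity dimension (e.g. soil protection or livelihood diversification).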