994 results for modern techniques
Abstract:
In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under such a unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002–03. The aim of these surveys is to understand human behavior and the lifestyle of people. Time allocation data are compositional in origin, that is, they are subject to non-negativity and constant-sum constraints. Thus, standard multivariate techniques cannot be directly applied to analyze them. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where objects are allocated to only a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Concretely, the probabilistic fuzzy c-means algorithm is conveniently adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn.
Key words: time use data; fuzzy clustering; FCM; simplex space; Aitchison distance
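The combination described in this abstract can be illustrated with a minimal NumPy sketch (not the authors' implementation): compositions are mapped to centred log-ratio (clr) coordinates, where the Euclidean metric coincides with the Aitchison distance, and probabilistic fuzzy c-means is run in that space. The three-part "activity" compositions and cluster count are illustrative assumptions.

```python
import numpy as np

def clr(x):
    # centred log-ratio transform: Euclidean distance in clr
    # coordinates equals the Aitchison distance on the simplex
    logx = np.log(x)
    return logx - logx.mean(axis=-1, keepdims=True)

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, seed=0):
    # standard probabilistic fuzzy c-means on the rows of X
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))  # memberships, rows sum to 1
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)  # FCM membership update
    return U, centers

# two toy groups of "activity pattern" compositions over 3 activities
rng = np.random.default_rng(1)
a = rng.dirichlet([60, 30, 10], size=15)   # work-heavy profiles
b = rng.dirichlet([10, 30, 60], size=15)   # leisure-heavy profiles
X = clr(np.vstack([a, b]))
U, centers = fuzzy_cmeans(X, c=2)
```

Unlike a hard partition, each row of `U` gives graded memberships, so a region with a mixed activity profile can belong appreciably to more than one cluster.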
Abstract:
The quantitative estimation of Sea Surface Temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern coretop samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as distance measure. Modern coretop datasets are characterised by a large amount of zeros. The zero replacement was carried out using a Bayesian approach, based on posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by means of a multiple approach, considering the Proxies correlation matrix, Standardized Residual Sum of Squares and Mean Squared Distance. This new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.
Key words: modern analogues; Aitchison distance; Proxies correlation matrix; Standardized Residual Sum of Squares
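The two ingredients named here, Bayesian-multinomial zero replacement and the Aitchison distance, can be sketched in a few lines of NumPy. This is an illustrative simplification (a symmetric Dirichlet prior and made-up assemblage counts), not the CODAMAT implementation.

```python
import numpy as np

def bayes_zero_replace(counts, alpha=0.5):
    # posterior-mean estimate of the multinomial parameter under a
    # symmetric Dirichlet(alpha) prior: zero counts receive a small
    # positive share, so log-ratios become computable
    counts = np.asarray(counts, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha * len(counts))

def aitchison_dist(x, y):
    # Aitchison distance: Euclidean distance between clr coordinates
    cx = np.log(x) - np.log(x).mean()
    cy = np.log(y) - np.log(y).mean()
    return float(np.linalg.norm(cx - cy))

fossil = bayes_zero_replace([12, 0, 7, 31])    # fossil assemblage counts
coretop = bayes_zero_replace([10, 2, 9, 29])   # one modern coretop sample
d = aitchison_dist(fossil, coretop)
```

Ranking all coretop samples by this distance and averaging the SSTs of the closest ones is the analogue-selection step the abstract describes.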
Abstract:
PURPOSE: Since 1982, the Radiation Oncology Group of the EORTC (EORTC ROG) has pursued an extensive Quality Assurance (QA) program involving all centres actively participating in its clinical research. The first step is the evaluation of the structure and of the human, technical and organisational resources of the centres, to assess their ability to comply with the current requirements for high-tech radiotherapy (RT). MATERIALS AND METHODS: A facility questionnaire (FQ) was developed in 1989 and adapted over the years to match the evolution of RT techniques. We report on the contents of the current FQ that was completed online by 98 active EORTC ROG member institutions from 19 countries, between December 2005 and October 2007. RESULTS: Similar to the data collected previously, large variations in equipment, staffing and workload between centres remain. Currently only 15 centres still use a Cobalt unit. All centres perform 3D Conformal RT, 79% of them can perform IMRT and 54% are able to deliver stereotactic RT. An external reference dosimetry audit (ERDA) was performed in 88% of the centres for photons and in 73% for electrons, but it was recent (<2 years) in only 74% and 60%, respectively. CONCLUSION: The use of the FQ helps maintain the minimum quality requirements within the EORTC ROG network: recommendations are made on the basis of the analysis of its results. The present analysis shows that modern RT techniques are widely implemented in the clinic but also that ERDA should be performed more frequently. Repeated assessment using the FQ is warranted to document the future evolution of the EORTC ROG institutions.
Abstract:
Chest physiotherapy (CP) using passive expiratory manoeuvres is widely used in Western Europe for the treatment of bronchiolitis, despite lacking evidence for its efficacy. We undertook an open randomised trial to evaluate the effectiveness of CP in infants hospitalised for bronchiolitis by comparing the time to clinical stability, the daily improvement of a severity score and the occurrence of complications between patients with and without CP. Children <1 year admitted for bronchiolitis in a tertiary hospital during two consecutive respiratory syncytial virus seasons were randomised to group 1 with CP (prolonged slow expiratory technique, slow accelerated expiratory flow, rarely induced cough) or group 2 without CP. All children received standard care (rhinopharyngeal suctioning, minimal handling, oxygen for saturation ≥92%, fractionated meals). Ninety-nine eligible children (mean age, 3.9 months) were included, 50 in group 1 and 49 in group 2, with similar baseline variables and clinical severity at admission. Time to clinical stability, assessed as primary outcome, was similar for both groups (2.9 ± 2.1 vs. 3.2 ± 2.8 days, P = 0.45). The rate of improvement of a clinical and respiratory score, defined as secondary outcome, only showed a slightly faster improvement of the respiratory score in the intervention group when including stethoacoustic properties (P = 0.044). Complications were rare but occurred more frequently, although not significantly (P = 0.21), in the control arm. In conclusion, this study shows the absence of effectiveness of CP using passive expiratory techniques in infants hospitalised for bronchiolitis. It seems justified to recommend against the routine use of CP in these patients.
Abstract:
BACKGROUND: Estimates of drug resistance incidence to modern first-line combination antiretroviral therapies against human immunodeficiency virus (HIV) type 1 are complicated by limited availability of genotypic drug resistance tests (GRTs) and uncertain timing of resistance emergence. METHODS: Five first-line combinations were studied (all paired with lamivudine or emtricitabine): efavirenz (EFV) plus zidovudine (AZT) (n = 524); EFV plus tenofovir (TDF) (n = 615); lopinavir (LPV) plus AZT (n = 573); LPV plus TDF (n = 301); and ritonavir-boosted atazanavir (ATZ/r) plus TDF (n = 250). Virological treatment outcomes were classified into 3 risk strata for emergence of resistance, based on whether undetectable HIV RNA levels were maintained during therapy and, if not, whether viral loads were >500 copies/mL during treatment. Probabilities for presence of resistance mutations were estimated from GRTs (n = 2876) according to risk stratum and therapy received at time of testing. On the basis of these data, events of resistance emergence were imputed for each individual and were assessed using survival analysis. Imputation was repeated 100 times, and results were summarized by median values (2.5th-97.5th percentile range). RESULTS: Six years after treatment initiation, EFV plus AZT showed the highest cumulative resistance incidence (16%) of all regimens (<11%). Confounder-adjusted Cox regression confirmed that first-line EFV plus AZT (reference) was associated with a higher median hazard for resistance emergence, compared with other treatments: EFV plus TDF (hazard ratio [HR], 0.57; range, 0.42-0.76), LPV plus AZT (HR, 0.63; range, 0.45-0.89), LPV plus TDF (HR, 0.55; range, 0.33-0.83), ATZ/r plus TDF (HR, 0.43; range, 0.17-0.83). Two-thirds of resistance events were associated with detectable HIV RNA level ≤500 copies/mL during treatment, and only one-third with virological failure (HIV RNA level, >500 copies/mL). 
CONCLUSIONS: The inclusion of TDF instead of AZT and ATZ/r was correlated with lower rates of resistance emergence, most likely because of improved tolerability and pharmacokinetics resulting from a once-daily dosage.
Abstract:
Taking on the challenge of understanding and explaining the Symphony of (today's) New World in realistic terms (not realist), this essay aims to analyse the post-Cold War era by devising a multi-conceptual framework that combines different theoretical contributions not yet linked in a fully explanatory way. This paper suggests two inter-related analytical contexts (or background melodies) to understand Dvořák's "New World". First, the socio-economic structural context that falls under the controversial category of Globalization and, second, the post-modern political structural context that is built on Robert Cooper's threefold analysis (Pre-modern, Modern and Post-modern) of today's world [Cooper, R.: 1997, 1999]. Lastly, the closing movement (allegro con fuoco) enters the normative arena to assess American foreign policy options in the light of the theoretical framework devised in the first part of the essay.
Abstract:
[Sale (Books). 1897-11-11. London]
Abstract:
Echocardiography is the preferred initial test to assess cardiac morphology and ventricular function. Cardiac MRI enables optimal visualisation of the heart muscle without contrast injection, and precise measurement of ventricular volumes and systolic function. It is therefore an ideal test for patients with poor echocardiographic windows or for the specific evaluation of the right heart chambers. Cardiac CT also images the heart muscle remarkably well and precisely measures ventricular systolic function after intravenous injection of iodinated contrast. Coronary CT may also, in selected cases, avoid the need for diagnostic coronary angiography. Although very accurate, these imaging modalities are expensive and may be contra-indicated for a particular patient. Their use in clinical practice has to follow the accepted guidelines.
Abstract:
This paper presents a pattern recognition method focused on images of paintings. The purpose is to construct a system able to recognize authors or art styles based on common elements of their work (here called patterns). The method is based on comparing images that contain the same or similar patterns. It uses different computer vision techniques, such as SIFT and SURF, to describe the patterns as descriptors, K-Means to classify and simplify these descriptors, and RANSAC to identify and keep good matches. The method performs well at finding patterns from known images, but less well when the images are unknown.
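The SIFT/SURF description step requires an image library such as OpenCV, but the RANSAC step that filters good matches from spurious ones can be shown self-contained. The sketch below runs RANSAC on 2-D points (fitting a line) rather than on descriptor matches; the data, threshold and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def ransac_line(points, n_iter=200, tol=0.05, seed=0):
    # RANSAC: repeatedly fit a line through 2 randomly chosen points
    # and keep the model with the most inliers (points within tol)
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        n = np.linalg.norm(q - p)
        if n == 0:
            continue
        dx, dy = (q - p) / n
        # perpendicular distance of every point to the line through p and q
        dist = np.abs(dx * (points[:, 1] - p[1]) - dy * (points[:, 0] - p[0]))
        inliers = dist < tol
        if inliers.sum() > best.sum():
            best = inliers
    return best

# 20 points near the line y = 2x plus 5 gross outliers
xs = np.linspace(0, 1, 20)
line_pts = np.column_stack([xs, 2 * xs + 0.001 * np.sin(xs)])
outliers = np.array([[0.2, 8.0], [0.5, -6.0], [0.8, 9.0],
                     [0.1, 7.0], [0.9, -5.0]])
pts = np.vstack([line_pts, outliers])
mask = ransac_line(pts)
```

In the matching setting, the same vote-for-the-best-model loop is run over candidate homographies instead of lines, discarding descriptor matches that disagree with the winning transform.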
Abstract:
In European countries and North America, people spend 80 to 90% of their time inside buildings and thus breathe indoor air. In Switzerland, special attention has been devoted to the 16 stations of the national network of observation of atmospheric pollutants (NABEL). The results indicate a reduction in outdoor pollution over the last ten years. With such a decrease in pollution over these ten years, the question becomes: how can we explain an increase in diseases? Indoor pollution can be the cause. Indoor contaminants that may create indoor air quality (IAQ) problems come from a variety of sources. These can include inadequate ventilation, temperature and humidity dysfunction, and volatile organic compounds (VOCs). The health effects from these contaminants are varied and can range from discomfort, irritation and respiratory diseases to cancer. Among such contaminants, environmental tobacco smoke (ETS) could be considered the most important in terms of both health effects and engineering controls of ventilation. To perform indoor pollution monitoring, several selected ETS tracers can be used, including carbon monoxide (CO), carbon dioxide (CO2), respirable particles (RSP), condensate, nicotine, polycyclic aromatic hydrocarbons (PAHs), nitrosamines, etc. In this paper, some examples are presented of IAQ problems that have occurred following the renewal of buildings and energy saving concerns. Using industrial hygiene sampling techniques and focussing on selected priority pollutants used as tracers, various problems have been identified and solutions proposed. [Author]
Abstract:
The practical performance of analytical redundancy for fault detection and diagnosis is often degraded by uncertainties prevailing not only in the system model but also in the measurements. In this paper, the problem of fault detection is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are then shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs), dealing with uncertain measurements and parameters. Through the work presented in this paper, it can be observed that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
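The core idea, checking whether an ARR can still be satisfied once measurements and parameters are replaced by intervals, fits in a short Python sketch. This is not the paper's modal-interval implementation: the ARR r = y − a·u, the interval bounds and the single-parameter model are all illustrative assumptions.

```python
class Interval:
    # minimal closed-interval arithmetic
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def contains(self, v):
        return self.lo <= v <= self.hi

def arr_consistent(y, u, a):
    # ARR r = y - a*u: consistent (no fault signalled) iff the
    # interval evaluation of the residual can contain zero
    return (y - a * u).contains(0.0)

a = Interval(1.8, 2.2)  # uncertain system gain
healthy = arr_consistent(Interval(1.9, 2.1), Interval(0.9, 1.1), a)
faulty = arr_consistent(Interval(5.0, 5.2), Interval(0.9, 1.1), a)
```

Because the residual interval absorbs both measurement and parameter uncertainty, a fault is flagged only when no value inside the bounds could explain the observations, which is what makes the interval test robust.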
Abstract:
The speed of fault isolation is crucial for the design and reconfiguration of fault-tolerant control (FTC). In this paper, the fault isolation problem is stated as a constraint satisfaction problem (CSP) and solved using constraint propagation techniques. The proposed method is based on constraint satisfaction techniques and on refining the uncertainty space of interval parameters. Compared with other approaches based on adaptive observers, the major advantage of the presented method is that isolation is fast even when taking into account uncertainty in parameters, measurements and model errors, and without the monotonicity assumption. In order to illustrate the proposed approach, a case study of a nonlinear dynamic system is presented.
Abstract:
Percutaneous ablative procedures allow curative treatment of stage BCLC 0 or BCLC A hepatocellular carcinoma, as well as liver metastases of colorectal cancer. Several methods exist including radiofrequency ablation, the most commonly used. These techniques can be used in combination with surgical excision or alone if surgery is contraindicated. They are associated with significantly reduced mortality as compared to surgery.