975 results for "Vulcanization characteristics based on accelerator combinations"
Abstract:
Doctoral thesis in Materials Engineering.
Abstract:
Doctoral Programme in Biomedical Engineering
Abstract:
Identification and characterization of the problem. One of the most important problems associated with software construction is its correctness. Seeking to provide guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Owing to their nature, applying formal methods requires considerable experience and expertise, particularly in mathematics and logic, which makes their application costly in practice. As a consequence, their application has been largely confined to critical systems, that is, systems whose malfunction can cause major damage, even though the benefits these techniques provide are relevant to all kinds of software. Transferring the benefits of formal methods to development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. The availability of automated analysis tools is of great importance. Examples include several powerful formal-methods-based analysis tools that target source code directly. For the vast majority of these tools, the gap between the notions developers are used to and those required to apply these formal analysis tools remains too wide. Many tools use assertion languages beyond the usual knowledge and habits of developers. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts to be analyzed grow (scalability).
This limitation is widely known and is considered critical to the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks, on the one hand, to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools usable by developers familiar with the application context, but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials will be literature relevant to the area and computing equipment. Methods. The methods of discrete mathematics, logic and software engineering will be employed. Expected results. One expected result of the project is the identification of specific domains for the application of formal analysis methods. The project is also expected to produce analysis tools whose usability is adequate for developers without specific training in the underlying formal methods. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness.
Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Consequently, their acceptance by software engineers is rather restricted, and formal methods applications have been confined to critical systems. Nevertheless, the advantages that formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes in which to apply automated formal analysis techniques. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
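As a minimal illustration of the kind of exhaustive search that SAT-solving-based analysis automates, and of why scalability is the central obstacle (the search space doubles with every variable), here is a brute-force propositional satisfiability check in Python; the clause encoding and the example formulas are our own, not artifacts of the project described above:

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Exhaustive SAT check over all 2**n_vars assignments.

    Clauses use DIMACS-style integer literals: k means variable k is
    true, -k means variable k is false. Returns a satisfying
    assignment, or None if the formula is unsatisfiable."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3): satisfiable
sat = brute_force_sat([[1, 2], [-1, 3], [-2, -3]], 3)
# x1 and (not x1): unsatisfiable, the search visits every assignment
unsat = brute_force_sat([[1], [-1]], 1)
```

Production SAT solvers replace this enumeration with conflict-driven clause learning, which is precisely where domain-specific structure can be exploited for scalability.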
Abstract:
PURPOSE: The aim of this study was to determine whether tumor location proximal or distal to the splenic flexure is associated with distinct molecular patterns and can predict clinical outcome in a homogeneous group of patients with Dukes B (T3-T4, N0, M0) colorectal cancer. It has been hypothesized that proximal and distal colorectal cancers may arise through different pathogenetic mechanisms. Although p53 and Ki-ras gene mutations occur frequently in distal tumors, another form of genomic instability, associated with defective DNA mismatch repair, has been identified predominantly in the proximal colon. To date, however, the clinical usefulness of these molecular characteristics remains unproven. METHODS: A total of 126 patients with a lymph node-negative sporadic adenocarcinoma of the colon or rectum were prospectively assessed with the endpoint of death from cancer. No patient received either radiotherapy or chemotherapy. p53 protein was studied by immunohistochemistry using the DO-7 monoclonal antibody, and p53 and Ki-ras gene mutations were detected by single-strand conformation polymorphism assay. RESULTS: During a mean follow-up of 67 months, the overall five-year survival was 70 percent. Nuclear p53 staining was found in 57 tumors (47 percent) and was more frequent in distal than in proximal tumors (55 vs. 21 percent; chi-squared test, P < 0.001). For the whole group, p53 protein expression correlated with poor survival in univariate and multivariate analysis (log-rank test, P = 0.01; hazard ratio = 2.16; 95 percent confidence interval = 1.12-4.11, P = 0.02). Distal colon tumors and rectal tumors exhibited similar molecular patterns and showed no difference in clinical outcome. In comparison with distal colorectal cancer, proximal tumors were found to differ in a statistically significant way on the following factors: mucinous content (P = 0.008), degree of histologic differentiation (P = 0.012), and p53 protein expression and gene mutation (P = 0.001 and P = 0.01, respectively).
Finally, patients with proximal tumors had a marginally better survival than those with distal colon or rectal cancers (log-rank test, P = 0.045). CONCLUSION: In this series of Dukes B colorectal cancers, p53 protein expression was an independent factor for survival, which also correlated with tumor location. Eighty-six percent of p53-positive tumors were located in the distal colon and rectum. Distal colon and rectum tumors had similar molecular and clinical characteristics. In contrast, proximal neoplasms seem to represent a distinct entity, with specific histopathologic characteristics, molecular patterns, and clinical outcome. Location of the neoplasm in reference to the splenic flexure should be considered before group stratification in future trials of adjuvant chemotherapy in patients with Dukes B tumors.
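To make the reported hazard ratio concrete: under the proportional-hazards assumption, a hazard ratio HR rescales baseline survival as S1(t) = S0(t)^HR. The sketch below applies the study's HR of 2.16 for p53-positive tumors to a hypothetical 80% baseline five-year survival; the baseline figure is illustrative, not taken from the study:

```python
def survival_under_hr(baseline_survival, hazard_ratio):
    # Proportional hazards: S_exposed(t) = S_baseline(t) ** HR
    return baseline_survival ** hazard_ratio

# HR = 2.16 for p53-positive tumors (from the abstract);
# the 0.80 baseline survival is a hypothetical illustration
s_p53_positive = survival_under_hr(0.80, 2.16)  # ~ 0.62
```

That is, a hazard ratio of 2.16 would turn a hypothetical 80% five-year survival into roughly 62% for the p53-positive group.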
Abstract:
On December 4, 2007, a 3 Mm³ landslide occurred along the northwestern shore of Chehalis Lake. The initiation zone is located at the intersection of the main valley slope and the northern sidewall of a prominent gully. The slope failure caused a displacement wave that ran up to 38 m on the opposite shore of the lake. The landslide is temporally associated with a rain-on-snow meteorological event, which is thought to have triggered it. This paper describes the Chehalis Lake landslide and presents a comparison of discontinuity orientation datasets obtained using three techniques (field measurements, terrestrial photogrammetric 3D models, and an airborne LiDAR digital elevation model) to describe the orientation and characteristics of the five discontinuity sets present. The discontinuity orientation data are used to perform kinematic, surface wedge limit equilibrium and three-dimensional distinct element analyses. The kinematic and surface wedge analyses suggest that the location of the slope failure (at the intersection of the valley slope and a gully wall) facilitated the development of the unstable rock mass, which initiated as a planar sliding failure. Results from the three-dimensional distinct element analyses suggest that the presence, orientation and high persistence of a discontinuity set dipping obliquely to the slope were critical to the development of the landslide and led to a failure mechanism dominated by planar sliding. The three-dimensional distinct element modelling also suggests that the presence of a steeply dipping discontinuity set striking perpendicular to the slope and associated with a fault exerted a significant control on the volume and extent of the failed rock mass, but not on the overall stability of the slope.
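For the planar sliding mechanism identified above, the simplest limit-equilibrium measure is the dry, cohesionless factor of safety FS = tan(φ)/tan(ψ), with φ the friction angle and ψ the dip of the sliding plane. The sketch below uses hypothetical values, not parameters from the Chehalis Lake back-analysis:

```python
import math

def planar_sliding_fs(friction_angle_deg, plane_dip_deg):
    # Dry, cohesionless planar sliding: FS = tan(phi) / tan(psi);
    # FS < 1 means the driving stress exceeds frictional resistance
    phi = math.radians(friction_angle_deg)
    psi = math.radians(plane_dip_deg)
    return math.tan(phi) / math.tan(psi)

# hypothetical: friction angle of 35 degrees on a 45-degree dipping plane
fs = planar_sliding_fs(35.0, 45.0)  # ~ 0.70, i.e. unstable
```

Real analyses like those in the paper add cohesion, persistence, water pressure and three-dimensional block geometry, which this one-line check deliberately omits.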
Abstract:
We tested the efficacy and safety of different combination therapies in hypertensive patients with uncontrolled blood pressure (BP) on monotherapy with a calcium antagonist: 1,647 hypertensive patients were enrolled to receive placebo for 4 weeks, followed by isradipine (ISR) 2.5 mg twice daily (b.i.d.) for 4 weeks. Nonresponders [diastolic BP (DBP) > 90 mm Hg] were randomly assigned to receive either the beta-blocker bopindolol 0.5 or 1 mg/day, the diuretic metolazone 1.25 or 2.5 mg/day, the angiotensin-converting enzyme (ACE) inhibitor enalapril 10 or 20 mg/day, ISR 5 mg b.i.d., or placebo. One hundred seventy-five patients receiving placebo dropped out; 93% (n = 1,376) of the 1,472 patients finished the 4-week monotherapy with ISR. Sixty percent (n = 826) reached target BP, and 40% (n = 550) remained uncontrolled and were randomized. Regardless of dosage, all drugs led to a comparable reduction in BP, except for the lower dosage of bopindolol and ISR 5 mg b.i.d., which were less effective in lowering systolic BP (SBP). The BP decrease achieved by combination therapy ranged from 10 to 15 mm Hg SBP and from 7 to 11 mm Hg DBP, but remained unchanged with placebo. Side effects were minor, and only 2.4% of patients discontinued therapy because of side effects. The side-effect score for edema was lower with ISR plus diuretics than with other combinations, whereas the ACE inhibitor was associated with a higher score for cough. Monotherapy with a calcium antagonist normalizes BP in about two-thirds of patients when used in general practice.
Abstract:
Background: Many studies have found considerable variations in the resource intensity of physical therapy episodes. Although they have identified several patient- and provider-related factors, few studies have examined their relative explanatory power. We sought to quantify the contribution of patients and providers to these differences and to examine how effective Swiss regulations are (a nine-session ceiling per prescription and a bonus for first treatments). Methods: Our sample consisted of 87,866 first physical therapy episodes performed by 3,365 physiotherapists based on referrals by 6,131 physicians. We modeled the number of visits per episode using a multilevel log-linear regression with crossed random effects for physiotherapists and physicians and with fixed effects for cantons. The three levels of explanatory variables were patient, physiotherapist and physician characteristics. Results: The median number of sessions was nine (interquartile range 6-13). Physical therapy use increased with age, female sex, higher health care costs, lower deductibles, surgery and specific conditions. Use rose with the share of nine-session episodes among physiotherapists or physicians, but fell with the share of new treatments. Geographical area had no influence. Most of the variance was explained at the patient level, but the available factors explained only 4% of it. Physiotherapists and physicians explained only 6% and 5% of the variance, respectively, although the available factors explained most of this variance. Regulations were the most powerful factors. Conclusion: Against the backdrop of abundant physical therapy supply, Swiss financial regulations did not restrict utilization. Given that patient-related factors explained most of the variance, this group should be subject to closer scrutiny. Moreover, further research is needed on the determinants of patient demand.
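The variance shares quoted above come from comparing the estimated variance components of the crossed random effects model: each level's share is its component divided by the total. A minimal sketch with hypothetical variance components, chosen purely to mimic the reported proportions:

```python
def variance_shares(components):
    # Share of total variance attributable to each level of the model
    total = sum(components.values())
    return {level: var / total for level, var in components.items()}

# hypothetical variance components from a crossed random effects fit
shares = variance_shares({
    "patient (residual)": 0.445,
    "physiotherapist": 0.030,
    "physician": 0.025,
})
# patient level dominates, as in the study's decomposition
```

The actual model would be fitted with a mixed-effects package; only the final decomposition arithmetic is shown here.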
Abstract:
The genetic variation and population structure of three populations of Anopheles darlingi from Colombia were studied using random amplified polymorphic DNA markers (RAPDs) and amplified fragment length polymorphism markers (AFLPs). Six RAPD primers produced 46 polymorphic fragments, while two AFLP primer combinations produced 197 polymorphic fragments from 71 DNA samples. Both of the evaluated genetic markers showed the presence of gene flow, suggesting that Colombian An. darlingi populations are in panmixia. Average genetic diversity, estimated from observed heterozygosity, was 0.374 (RAPD) and 0.309 (AFLP). RAPD and AFLP markers showed little evidence of geographic separation between eastern and western populations; however, the FST values showed high gene flow between the two western populations (RAPD: FST = 0.029, Nm = 8.5; AFLP: FST = 0.051, Nm = 4.7). According to analysis of molecular variance (AMOVA), the genetic distance between populations was significant (RAPD: ΦST = 0.084; AFLP: ΦST = 0.229; P < 0.001). The FST distances and AMOVAs using AFLP loci support the differentiation of the Guyana biogeographic province population from those of the Chocó-Magdalena. In this last region, the Chocó and Córdoba populations showed the highest gene flow.
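The Nm values quoted alongside FST follow from Wright's island-model approximation FST ≈ 1/(4Nm + 1), i.e. Nm = (1 − FST)/(4·FST). A quick check of the standard formula against the reported figures (this is the textbook relation, not necessarily the authors' exact pipeline):

```python
def nm_from_fst(fst):
    # Wright's island-model approximation: Fst ~ 1 / (4*Nm + 1),
    # hence Nm = (1 - Fst) / (4 * Fst)
    return (1.0 - fst) / (4.0 * fst)

nm_rapd = nm_from_fst(0.029)  # ~ 8.4, close to the reported 8.5
nm_aflp = nm_from_fst(0.051)  # ~ 4.7, matching the reported value
```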
Abstract:
Objective To compute the burden of cancer attributable to current and former alcohol consumption in eight European countries, based on direct relative risk estimates from a cohort study. Design Combination of a prospective cohort study with representative population-based data on alcohol exposure. Setting Eight countries (France, Italy, Spain, United Kingdom, the Netherlands, Greece, Germany, Denmark) participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Participants 109 118 men and 254 870 women, mainly aged 37-70. Main outcome measures Hazard rate ratios expressing the relative risk of cancer incidence for former and current alcohol consumption among EPIC participants. Hazard rate ratios were combined with representative information on alcohol consumption to calculate alcohol attributable fractions of causally related cancers by country and sex, partial alcohol attributable fractions for consumption above the recommended upper limit (two drinks a day, about 24 g of alcohol, for men; one drink a day, about 12 g, for women), and the estimated total annual number of cases of alcohol attributable cancer. Results If we assume causality, 10% (95% confidence interval 7% to 13%) of the incidence of total cancer in men and 3% (1% to 5%) in women was attributable to former and current alcohol consumption in the selected European countries. For selected cancers, the figures for men and women respectively were 44% (31% to 56%) and 25% (5% to 46%) for the upper aerodigestive tract, 33% (11% to 54%) and 18% (−3% to 38%) for the liver, and 17% (10% to 25%) and 4% (−1% to 10%) for colorectal cancer, and 5.0% (2% to 8%) for female breast cancer. A substantial part of the alcohol attributable fraction in 2008 was associated with alcohol consumption above the recommended upper limit: 33 037 of 178 578 alcohol related cancer cases in men and 17 470 of 397 043 alcohol related cases in women.
Conclusions In western Europe, an important proportion of cases of cancer can be attributed to alcohol consumption, especially to consumption above the recommended upper limits. These data support current political efforts to reduce alcohol consumption, or abstain from it, in order to reduce the incidence of cancer.
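Attributable fractions like those reported above combine relative risks with population exposure distributions. The standard Levin formula for a single exposure level is AF = p(RR − 1)/(1 + p(RR − 1)); the sketch below uses illustrative prevalence and risk figures, not EPIC's actual category-by-category calculation with hazard rate ratios:

```python
def attributable_fraction(exposure_prevalence, relative_risk):
    # Levin's population attributable fraction:
    # AF = p*(RR - 1) / (1 + p*(RR - 1))
    excess = exposure_prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# hypothetical figures: 40% of the population exposed, relative risk 1.5
af = attributable_fraction(0.40, 1.5)  # ~ 0.167, i.e. ~17% of cases
```

In practice the EPIC analysis sums such contributions over consumption categories, sexes and countries, which is why the reported fractions differ by cancer site.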
Abstract:
Usually, psychometricians apply classical factor analysis to evaluate the construct validity of order rank scales. Nevertheless, these scales have particular characteristics that must be taken into account: total scores and ranks are highly relevant
Abstract:
This paper focuses on the problem of realizing a plane-to-plane virtual link between a camera attached to the end-effector of a robot and a planar object. In order to make the system independent of the object's surface appearance, a structured light emitter is attached to the camera so that four laser pointers are projected onto the object. In a previous paper we showed that such a system performs well and has desirable characteristics, such as partial decoupling near the desired state and robustness against misalignment of the emitter and the camera (J. Pages et al., 2004). However, no analytical results concerning the global asymptotic stability of the system were obtained, due to the high complexity of the visual features used. In this work we present a better set of visual features, which improves on the features in (J. Pages et al., 2004) and for which global asymptotic stability can be proved
Abstract:
This paper presents a study of connection availability in GMPLS over optical transport networks (OTN), taking into account different network topologies. Two basic path protection schemes are considered and compared with the no-protection case. The selected topologies are heterogeneous in geographic coverage, network diameter, link lengths, and average node degree. Connection availability is also computed considering the reliability data of physical components and a well-known network availability model. Results show several correspondences between suitable path protection schemes and network topology characteristics
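The availability computations referred to above follow standard reliability algebra: an unprotected path is a series system (the product of its component availabilities), and a 1+1 protected connection fails only when the working and backup paths are down simultaneously. A minimal sketch with hypothetical per-component availabilities, not values from the paper's reliability data:

```python
def path_availability(component_availabilities):
    # Series system: the path is up only if every component is up
    a = 1.0
    for avail in component_availabilities:
        a *= avail
    return a

def protected_availability(working, backup):
    # 1+1 path protection: the connection is down only if both
    # the working and the backup path are down at the same time
    return 1.0 - (1.0 - working) * (1.0 - backup)

# hypothetical per-component availabilities for two disjoint paths
working = path_availability([0.9995, 0.999, 0.9995])
backup = path_availability([0.999, 0.999, 0.999, 0.999])
total = protected_availability(working, backup)  # far closer to 1.0
```

This already shows the qualitative effect studied in the paper: protection pushes unavailability from the 10⁻³ range down by several orders of magnitude, at the cost of a disjoint backup path whose length depends on the topology.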
Abstract:
This paper introduces Collage, a high-level IMS-LD compliant authoring tool that is specialized for CSCL (Computer-Supported Collaborative Learning). Nowadays, CSCL is a key trend in e-learning, since it highlights the importance of social interactions as an essential element of learning. CSCL is an interdisciplinary domain, which demands participatory design techniques that allow teachers to get directly involved in design activities. Developing CSCL designs using LD is a difficult task for teachers, since LD is a complex technical specification and modelling collaborative characteristics can be tricky. Collage helps teachers in the process of creating their own potentially effective collaborative Learning Designs by reusing and customizing patterns, according to the requirements of a particular learning situation. These patterns, called Collaborative Learning Flow Patterns (CLFPs), represent best practices that are repeatedly used by practitioners when structuring the flow of (collaborative) learning activities. An example of an LD that can be created using Collage is illustrated in the paper. Preliminary evaluation results show that teachers with experience in collaborative learning but without LD knowledge can successfully design real collaborative learning experiences using Collage.
Abstract:
Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7,596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays, 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%) and the percentage of positive outcomes shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time.
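In a stratified probabilistic design like the one above, the population-level prevalence is a weighted average of within-stratum positive rates, with weights given by each stratum's share of circulating traffic. A minimal sketch with hypothetical strata; the numbers are illustrative, not the study's sampling frame:

```python
def stratified_prevalence(strata):
    # Weighted estimator: each stratum contributes its traffic share
    # times its within-stratum positive rate
    return sum(share * positives / tested
               for share, positives, tested in strata)

# hypothetical strata: (traffic share, positive tests, tests performed)
est = stratified_prevalence([
    (0.70, 40, 5000),  # weekdays, 0.8% positive
    (0.15, 19, 1000),  # Saturdays, 1.9% positive
    (0.15, 43, 1000),  # Sundays, 4.3% positive
])
# overall estimate ~ 1.5%, dominated by the large weekday stratum
```

This weighting is what separates the study's 1.29% population estimate from the much higher rates seen in non-random checkpoint data, which oversample high-risk times and places.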
Abstract:
The diagnosis of idiopathic Parkinson's disease (IPD) is entirely clinical. The fact that neuronal damage begins 5-10 years before the occurrence of sub-clinical signs underlines the importance of preclinical diagnosis. A new approach for in-vivo pathophysiological assessment of IPD-related neurodegeneration was implemented based on recently developed neuroimaging methods. It relies on non-invasive magnetic resonance data sensitive to brain tissue property changes that precede macroscopic atrophy in the early stages of IPD. This research aims to determine the brain tissue property changes induced by neurodegeneration that can be linked to clinical phenotypes, which will allow us to create a predictive model for early diagnosis of IPD. We hypothesized that the degree of disease progression in IPD patients would have a differential and specific impact on the brain tissue properties used to create a predictive model of motor and non-motor impairment in IPD. We studied the potential of in-vivo quantitative imaging sensitive to neurodegeneration-related brain tissue characteristics to detect changes in patients with IPD. We carried out methodological work within the well-established SPM8 framework to estimate the sensitivity of tissue probability maps for automated tissue classification in the detection of early IPD. We performed whole-brain multi-parameter mapping at high resolution, followed by voxel-based morphometric (VBM) analysis and voxel-based quantification (VBQ) comparing healthy subjects to IPD patients. We found a trend toward tissue property changes in the olfactory bulb area in the MT and R1 parameters (p < 0.001) that did not reach significance. Compared with the IPD patients, the healthy group presented bilaterally higher MT and R1 intensities in this specific functional region. These results did not correlate with age, severity or duration of disease. We failed to demonstrate any changes in the R2* parameter.
We interpreted our findings as demyelination of the olfactory tract, which manifests clinically as anosmia. However, the lack of correlation with disease duration or severity complicates their use in the creation of a predictive model of impairment in IPD.