845 results for Box-Cox transformation and quintile-based capability indices
Reductive dechlorination of TCE and cis-DCE by zero-valent iron and iron-based bimetallic reductants
Abstract:
Chlorinated ethylenes (CEs) are among the most frequently detected pollutants in groundwater. Several studies have shown iron-based bimetallic reductants to be an effective means of degrading chlorinated ethylenes. However, many fundamental issues surrounding the chemistry of this phenomenon remain elusive. In this study, the kinetics and compound-specific isotope analysis of the reductive dechlorination of TCE and cis-DCE by unamended iron and iron-based bimetallic reductants were evaluated. In general, all the bimetallic reductants tested increased the reactivity of the degradation, with palladium and nickel being the most reactive additional metals. Ethene and ethane were the major products of TCE degradation. The results support the hypothesis of simultaneous hydrogenolysis and β-elimination; however, the first step of TCE degradation by Au/Fe proceeds preferentially via β-elimination, whereas with unamended iron, Pt/Fe and Co/Fe it proceeds preferentially via hydrogenolysis. No clear explanation was found for the high reactivity of the bimetallic systems.
Abstract:
The functional and structural performance of a 5 cm synthetic small-diameter vascular graft (SDVG), produced by the copolymerization of polyvinyl alcohol hydrogel with low-molecular-weight dextran (PVA/Dx graft) and combined with mesenchymal stem cell (MSC)-based therapies and anticoagulant treatment with heparin, clopidogrel and warfarin, was tested in the ovine model over a healing period of 24 weeks. The results were compared with those obtained with standard expanded polytetrafluoroethylene grafts (ePTFE graft). Blood flow, vessel and graft diameter measurements, graft appearance and patency rate (PR), thrombus, stenosis and collateral vessel formation were evaluated by B-mode ultrasound and audio and color flow Doppler. Morphologic evaluation of the grafts and regenerated vessels was performed by scanning electron microscopy (SEM), histopathological and immunohistochemical analysis. All PVA/Dx grafts maintained a similar or higher PR, and systolic/diastolic laminar blood flow velocities were similar to those of ePTFE grafts. CD14 (macrophages) and α-actin (smooth muscle) staining gave similar results in the PVA/Dx/MSCs and ePTFE graft groups. The fibrosis layer was thinner, and endothelial cells were detected only at the graft-artery transitions where MSCs had been added. In conclusion, the PVA/Dx graft can be an excellent scaffold candidate for vascular reconstruction, including mechanically challenging clinical applications such as SDVGs, especially when combined with MSC-based therapies to promote greater endothelialization, lower fibrosis of the vascular prosthesis and higher PR values.
Abstract:
In recent years, engine developers have placed increasing emphasis on achieving maximum thermal and mechanical efficiency. Research advances have proven the effectiveness of downsized, turbocharged, direct-injection concepts applied to gasoline combustion systems in reducing overall fuel consumption while respecting exhaust emission limits. These new technologies require more complex engine control units. The sound emitted by a mechanical system contains a great deal of information related to its operating condition, and it can be used for control and diagnostic purposes. This thesis shows how the functions normally carried out by different dedicated on-board sensors can be executed at the same time using a single multifunction sensor based on low-cost microphone technology. A theoretical background on sound and signal processing is provided in chapter 1. In modern turbocharged downsized GDI engines, the achievement of maximum thermal efficiency is precluded by the occurrence of knock. Knock emits an unmistakable sound perceived by the human ear as a clink. Chapter 2 shows how this characteristic sound can be used for knock control purposes, from the first experimental assessment tests to the implementation in a real, production-type engine control unit. Chapter 3 focuses on misfire detection. By concentrating on the low-frequency domain of the engine sound spectrum, features related to each combustion cycle of each cylinder can be identified and isolated. An innovative approach to misfire detection, which has the advantage of not being affected by road and driveline conditions, is introduced. A preliminary study of air path leak detection techniques based on acoustic emission analysis has been developed, and the first experimental results are shown in chapter 4. Finally, chapter 5 reports an innovative detection methodology, based on engine vibration analysis, that can provide useful information about the combustion phase.
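As a purely illustrative sketch of the acoustic approach (not the thesis's algorithm), the snippet below computes a band-pass energy "knock index" from one microphone window; the frequency band, sampling rate, threshold and signal are assumed values invented for the example.

```python
# Hypothetical sketch: knock detection from a microphone signal by measuring
# band-pass energy in a typical knock frequency band. Band limits, sampling
# rate and threshold are illustrative assumptions, not values from the thesis.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def knock_index(mic_signal, fs, band=(5_000, 15_000)):
    """Return the mean energy of the microphone signal in the knock band."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, mic_signal)
    return float(np.sum(filtered ** 2) / len(filtered))

# Example: one engine-cycle window sampled at 48 kHz
fs = 48_000
t = np.arange(0, 0.05, 1 / fs)
window = 0.01 * np.random.randn(t.size)          # background combustion noise
window += 0.05 * np.sin(2 * np.pi * 7_000 * t)   # superimposed knock-like oscillation

threshold = 1e-4  # would be calibrated on non-knocking reference cycles
print("knock detected:", knock_index(window, fs) > threshold)
```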
Abstract:
Biomarkers are biological indicators of human health conditions. Their ultra-sensitive quantification is of paramount importance in clinical monitoring and early disease diagnosis. Biosensors are simple, easy-to-use analytical devices and, among them, electrochemiluminescence (ECL) is one of the most promising analytical techniques, requiring ever-increasing sensitivity to improve its clinical effectiveness. The scope of this project was the investigation of ECL generation mechanisms in order to enhance ECL intensity, also through the identification of suitable nanostructures. The combination of nanotechnology, microscopy and ECL has proved to be a very successful strategy for improving the analytical efficiency of ECL in one of its most promising bioanalytical approaches, the bead-based immunoassay. Nanosystems, such as [Ru(bpy)3]2+-dye-doped nanoparticles (DDSNPs) and Bodipy Carbon Nanodots, have been used to improve the sensitivity of ECL techniques thanks to their advantageous and tuneable properties, reaching a signal increase of 750% in the DDSNP bead-based immunoassay system. In this thesis, an investigation of size and distance effects on ECL mechanisms was carried out through the innovative combination of ECL microscopy and electrochemical mapping of radicals. It allowed the discovery of an unexpected and highly efficient mechanistic path for ECL generation at small distances from the electrode surface, which was exploited and enhanced by adding the branched amine DPIBA to the usual coreactant TPrA solution, increasing the ECL efficiency by up to 128%. Finally, a bead-based immunoassay and an immunosensor specific for cardiac Troponin I were built by exploiting the previous results and the features of carbon nanotubes, which created a conductive layer around the beads, enhancing the signal by 70% and activating an ECL mechanism not previously observed in such systems. In conclusion, the combination of ECL microscopy and nanotechnology, together with a deep understanding of the mechanisms responsible for ECL emission, led to a great enhancement of the signal.
Abstract:
Natural Language Processing (NLP) has always been one of the most popular topics in Artificial Intelligence. Argument-related research in NLP, such as argument detection, argument mining and argument generation, has become popular, especially in recent years. In our daily lives, we use arguments to express ourselves, and the quality of those arguments heavily affects the effectiveness of our communication with others. In professional fields, such as legislation and academia, high-quality arguments play an even more critical role. Therefore, generating arguments of good quality is a challenging research task of great importance in NLP. The aim of this work is to investigate the automatic generation of good-quality arguments according to a given topic, stance and aspect (control codes). To achieve this goal, a module based on BERT [17] that judges an argument's quality is constructed and used to assess the quality of the generated arguments. Another module based on GPT-2 [19] is implemented to generate arguments, with stances and aspects used as guidance during generation. After combining all these models and techniques, the generated arguments can be ranked to evaluate the final performance. This dissertation describes the architecture and experimental setup, analyzes the results of our experimentation, and discusses future directions.
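A hedged sketch of such a two-module pipeline is given below, using the Hugging Face transformers library; the model names, the control-code prompt format and the untrained quality scorer are assumptions for illustration, not the dissertation's exact setup.

```python
# Hedged sketch of the two-module idea: a GPT-2 generator prompted with
# topic/stance/aspect "control codes" and a BERT-based scorer that ranks the
# candidates. The prompt format is invented; the scorer here is untrained and
# would need fine-tuning on argument-quality labels before its scores mean anything.
import torch
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          BertForSequenceClassification, BertTokenizerFast)

gen_tok = GPT2TokenizerFast.from_pretrained("gpt2")
gen = GPT2LMHeadModel.from_pretrained("gpt2")
judge_tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
judge = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

prompt = "<topic> nuclear energy <stance> pro <aspect> safety <argument>"
ids = gen_tok(prompt, return_tensors="pt").input_ids
outputs = gen.generate(ids, max_length=80, do_sample=True, top_p=0.9,
                       num_return_sequences=3, pad_token_id=gen_tok.eos_token_id)
candidates = [gen_tok.decode(o, skip_special_tokens=True) for o in outputs]

with torch.no_grad():
    scores = [judge(**judge_tok(c, return_tensors="pt", truncation=True))
              .logits.softmax(-1)[0, 1].item() for c in candidates]
ranked = sorted(zip(scores, candidates), reverse=True)  # highest quality score first
```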
Abstract:
Real-world search problems, characterised by nonlinearity, noise and multidimensionality, are often best solved by hybrid algorithms. Techniques embodying different necessary features are triggered at specific iterations, in response to the current state of the problem space. In the existing literature, this alternation is managed either statically (through pre-programmed policies) or dynamically, at the cost of high coupling with the algorithm's inner representation. We extract two design patterns for hybrid metaheuristic search algorithms, the All-Seeing Eye and the Commentator patterns, which we argue should be replaced by the more flexible and loosely coupled Simple Black Box (Two-B) and Utility-based Black Box (Three-B) patterns that we propose here. We recommend the Two-B pattern for purely fitness-based hybridisations and the Three-B pattern for more generic hybridisations based on search quality evaluation.
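A minimal sketch of the loosely coupled black-box idea follows; the component searches and the stall-based switching rule are invented for illustration and do not reproduce the Two-B/Three-B reference designs.

```python
# Minimal sketch (not the paper's reference implementation): the hybrid
# controller sees each component search only through a propose/step interface
# plus the fitness it returns, and switches components when improvement stalls.
import random

class BlackBoxSearch:
    """Interface every component algorithm exposes to the controller."""
    def step(self, best_x, fitness):
        raise NotImplementedError

class RandomPerturbation(BlackBoxSearch):
    def step(self, best_x, fitness):
        return [xi + random.gauss(0, 0.5) for xi in best_x]

class LocalNudge(BlackBoxSearch):
    def step(self, best_x, fitness):
        return [xi + random.uniform(-0.05, 0.05) for xi in best_x]

def hybrid_minimise(fitness, x0, components, iters=200, patience=10):
    best_x, best_f, stall, current = x0, fitness(x0), 0, 0
    for _ in range(iters):
        cand = components[current].step(best_x, best_f)
        f = fitness(cand)
        if f < best_f:
            best_x, best_f, stall = cand, f, 0
        else:
            stall += 1
        if stall >= patience:          # improvement stalled: hand over to the next component
            current = (current + 1) % len(components)
            stall = 0
    return best_x, best_f

sphere = lambda x: sum(xi * xi for xi in x)
print(hybrid_minimise(sphere, [3.0, -2.0], [RandomPerturbation(), LocalNudge()]))
```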
Abstract:
Nonlinear regression problems can often be reduced to linearity by transforming the response variable (e.g., using the Box-Cox family of transformations). The classic estimates of the parameter defining the transformation as well as of the regression coefficients are based on the maximum likelihood criterion, assuming homoscedastic normal errors for the transformed response. These estimates are nonrobust in the presence of outliers and can be inconsistent when the errors are nonnormal or heteroscedastic. This article proposes new robust estimates that are consistent and asymptotically normal for any unimodal and homoscedastic error distribution. For this purpose, a robust version of conditional expectation is introduced for which the prediction mean squared error is replaced with an M scale. This concept is then used to develop a nonparametric criterion to estimate the transformation parameter as well as the regression coefficients. A finite sample estimate of this criterion based on a robust version of smearing is also proposed. Monte Carlo experiments show that the new estimates compare favorably with respect to the available competitors.
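To make the classic baseline concrete, the sketch below estimates the Box-Cox parameter by maximum likelihood with SciPy and fits least squares on the transformed response, on purely synthetic data; the robust M-scale and smearing estimates proposed in the article are not reproduced here, and scipy.stats.boxcox maximises a marginal normality likelihood rather than the joint regression likelihood.

```python
# Sketch of the classic (nonrobust) baseline the article improves on:
# Box-Cox transform y^(lambda) = (y**lambda - 1)/lambda (log y for lambda = 0),
# lambda estimated by maximum likelihood, then least squares on the transformed scale.
# The simulated data are purely illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=200)
y = (2.0 + 0.8 * x + rng.normal(0, 0.3, size=200)) ** 2   # nonlinear, positive response

# Step 1: ML estimate of the transformation parameter lambda
y_transformed, lam_hat = stats.boxcox(y)

# Step 2: least-squares fit on the transformed scale
X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y_transformed, rcond=None)
print(f"lambda ~ {lam_hat:.2f}, coefficients ~ {beta_hat.round(2)}")
```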
Abstract:
There is growing popularity in the use of composite indices and rankings for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices and to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation and attribute weighting involved in their computation. The integrated model developed is based on the simulation of composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach developed in this dissertation was automated through an IT artifact that was designed, developed and evaluated following the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for the year 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of our multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured in per capita income, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings and less agreement between the benchmark rankings and our multi-method rankings than higher-income country groups.
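As a toy illustration of the multi-scenario assessment idea (not the dissertation's Visual Basic artifact), the sketch below computes one composite index under two invented methodological scenarios and compares the resulting rankings; all data, weights and scenario choices are assumptions.

```python
# Illustrative sketch: the same sub-indicators are normalised, weighted and
# aggregated under two methodological choices, and the resulting rankings are
# compared. The dissertation's artifact covers 115 such scenarios on the
# Doing Business data; this toy example uses invented numbers.
import numpy as np

indicators = np.array([[0.9, 0.4, 120.0],     # rows: entities, cols: sub-indicators
                       [0.7, 0.8,  60.0],
                       [0.5, 0.6,  30.0]])
weights = np.array([0.5, 0.3, 0.2])

def minmax(col):
    return (col - col.min()) / (col.max() - col.min())

def zscore(col):
    return (col - col.mean()) / col.std()

# Scenario A: min-max normalisation + weighted arithmetic aggregation
scen_a = (np.apply_along_axis(minmax, 0, indicators) * weights).sum(axis=1)
# Scenario B: z-score normalisation + weighted arithmetic aggregation
scen_b = (np.apply_along_axis(zscore, 0, indicators) * weights).sum(axis=1)

rank = lambda s: (-s).argsort().argsort() + 1   # 1 = best
print("scenario A ranks:", rank(scen_a))
print("scenario B ranks:", rank(scen_b))
```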
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
OBJECTIVES: In this population-based study, reference values were generated for renal length, and the heritability and factors associated with kidney length were assessed. METHODS: Anthropometric parameters and renal ultrasound measurements were assessed in randomly selected nuclear families of European ancestry (Switzerland). The adjusted narrow sense heritability of kidney size parameters was estimated by maximum likelihood assuming multivariate normality after power transformation. Gender-specific reference centiles were generated for renal length according to body height in the subset of non-diabetic non-obese participants with normal renal function. RESULTS: We included 374 men and 419 women (mean ± SD, age 47 ± 18 and 48 ± 17 years, BMI 26.2 ± 4 and 24.5 ± 5 kg/m(2), respectively) from 205 families. Renal length was 11.4 ± 0.8 cm in men and 10.7 ± 0.8 cm in women; there was no difference between right and left renal length. Body height, weight and estimated glomerular filtration rate (eGFR) were positively associated with renal length, kidney function negatively, age quadratically, whereas gender and hypertension were not. The adjusted heritability estimates of renal length and volume were 47.3 ± 8.5 % and 45.5 ± 8.8 %, respectively (P < 0.001). CONCLUSION: The significant heritability of renal length and volume highlights the familial aggregation of this trait, independently of age and body size. Population-based references for renal length provide a useful guide for clinicians. KEY POINTS: • Renal length and volume are heritable traits, independent of age and size. • Based on a European population, gender-specific reference values/percentiles are provided for renal length. • Renal length correlates positively with body length and weight. • There was no difference between right and left renal lengths in this study. • This negates general teaching that the left kidney is larger and longer.
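For illustration only, the sketch below tabulates gender-specific renal length centiles by body-height band on synthetic data; the study itself derives smoothed reference centiles, which this simple binning does not reproduce, and all values here are invented.

```python
# Illustrative sketch (synthetic data): gender-specific reference centiles for
# renal length tabulated by body-height band. Not the study's method or data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 800
df = pd.DataFrame({
    "sex": rng.choice(["M", "F"], n),
    "height_cm": rng.normal(172, 9, n),
})
df["renal_length_cm"] = 0.05 * df["height_cm"] + rng.normal(2.5, 0.6, n)

df["height_band"] = pd.cut(df["height_cm"], bins=[150, 160, 170, 180, 190, 200])
centiles = (df.groupby(["sex", "height_band"], observed=True)["renal_length_cm"]
              .quantile([0.03, 0.50, 0.97])      # 3rd, 50th and 97th centiles
              .unstack())
print(centiles.round(1))
```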
Abstract:
BACKGROUND: In contrast with established evidence linking high doses of ionizing radiation with childhood cancer, research on low-dose ionizing radiation and childhood cancer has produced inconsistent results. OBJECTIVE: We investigated the association between domestic radon exposure and childhood cancers, particularly leukemia and central nervous system (CNS) tumors. METHODS: We conducted a nationwide census-based cohort study including all children < 16 years of age living in Switzerland on 5 December 2000, the date of the 2000 census. Follow-up lasted until the date of diagnosis, death, emigration, a child's 16th birthday, or 31 December 2008. Domestic radon levels were estimated for each individual home address using a model developed and validated based on approximately 45,000 measurements taken throughout Switzerland. Data were analyzed with Cox proportional hazard models adjusted for child age, child sex, birth order, parents' socioeconomic status, environmental gamma radiation, and period effects. RESULTS: In total, 997 childhood cancer cases were included in the study. Compared with children exposed to a radon concentration below the median (< 77.7 Bq/m3), adjusted hazard ratios for children with exposure ≥ the 90th percentile (≥ 139.9 Bq/m3) were 0.93 (95% CI: 0.74, 1.16) for all cancers, 0.95 (95% CI: 0.63, 1.43) for all leukemias, 0.90 (95% CI: 0.56, 1.43) for acute lymphoblastic leukemia, and 1.05 (95% CI: 0.68, 1.61) for CNS tumors. CONCLUSIONS: We did not find evidence that domestic radon exposure is associated with childhood cancer, despite relatively high radon levels in Switzerland.
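A minimal sketch of the kind of adjusted Cox proportional hazards analysis described above is given below, using the lifelines package on synthetic data; the column names, covariates and exposure coding are assumptions, not the study's variables, and the random data carry no real effect.

```python
# Hedged sketch: Cox proportional hazards model for time to diagnosis with an
# exposure-category indicator and adjustment covariates, fitted with lifelines.
# The data frame is synthetic and purely illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5_000
df = pd.DataFrame({
    "followup_years":  rng.uniform(0.5, 8.0, n),   # censored follow-up time
    "cancer":          rng.binomial(1, 0.01, n),   # event indicator
    "radon_ge_p90":    rng.binomial(1, 0.10, n),   # exposure >= 90th percentile
    "age":             rng.uniform(0, 16, n),
    "gamma_radiation": rng.normal(100, 20, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="cancer")
print(cph.hazard_ratios_["radon_ge_p90"])   # adjusted hazard ratio for high exposure
```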
Abstract:
Computed tomography (CT) is the standard imaging modality for tumor volume delineation in radiotherapy treatment planning of retinoblastoma, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but it has low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is not currently integrated into treatment planning and is used only for diagnosis and follow-up. Our ultimate goal is automatic segmentation of the gross tumor volume (GTV) in 3D US, segmentation of the organs at risk (OAR) in CT, and registration of both modalities. In this paper, we present some preliminary results in this direction. We present 3D active-contour-based segmentation of the eyeball and the lens in CT images; the presented approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. We then present two approaches for the fusion of 3D CT and US images: (i) a landmark-based transformation, and (ii) an object-based transformation that makes use of eyeball contour information in the CT and US images.
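As a hedged illustration of the landmark-based fusion option (i), the sketch below estimates a rigid CT-US transform from matched landmark pairs using the standard Kabsch/Procrustes solution; the landmark coordinates are invented and this is not the paper's implementation.

```python
# Minimal sketch of a landmark-based fusion step: given matched 3D landmarks
# picked in the US and CT volumes, estimate the rigid transform (rotation +
# translation) with the Kabsch/Procrustes method. Coordinates are invented.
import numpy as np

def rigid_from_landmarks(src, dst):
    """Least-squares rotation R and translation t so that R @ src + t ~ dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

us_pts = np.array([[10.0, 2.0, 5.0], [12.0, 8.0, 4.0], [7.0, 6.0, 9.0], [9.0, 3.0, 1.0]])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
ct_pts = us_pts @ R_true.T + np.array([5.0, -2.0, 3.0])

R, t = rigid_from_landmarks(us_pts, ct_pts)
print(np.allclose(us_pts @ R.T + t, ct_pts))        # True: landmarks align
```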
Abstract:
Cyclooxygenase-2 (COX-2), a key enzyme in prostaglandin synthesis, is highly expressed during inflammation and cellular transformation and promotes tumor progression and angiogenesis. We have previously demonstrated that endothelial cell COX-2 is required for integrin alphaVbeta3-dependent activation of Rac-1 and Cdc-42 and for endothelial cell spreading, migration, and angiogenesis (Dormond, O., Foletti, A., Paroz, C., and Ruegg, C. (2001) Nat. Med. 7, 1041-1047; Dormond, O., Bezzi, M., Mariotti, A., and Ruegg, C. (2002) J. Biol. Chem. 277, 45838-45846). In this study, we addressed the question of whether integrin-mediated cell adhesion may regulate COX-2 expression in endothelial cells. We report that cell detachment from the substrate caused rapid degradation of COX-2 protein in human umbilical vein endothelial cells (HUVEC) independent of serum stimulation. This effect was prevented by broad inhibition of cellular proteinases and by neutralizing lysosomal activity but not by inhibiting the proteasome. HUVEC adhesion to laminin, collagen I, fibronectin, or vitronectin induced rapid COX-2 protein expression with peak levels reached within 2 h and increased COX-2-dependent prostaglandin E2 production. In contrast, nonspecific adhesion to poly-L-lysine was ineffective in inducing COX-2 expression. Furthermore, the addition of matrix proteins in solution promoted COX-2 protein expression in suspended or poly-L-lysine-attached HUVEC. Adhesion-induced COX-2 expression was strongly suppressed by pharmacological inhibition of c-Src, phosphatidylinositol 3-kinase, p38, extracellular-regulated kinase 1/2, and, to a lesser extent, protein kinase C and by the inhibition of mRNA or protein synthesis. In conclusion, this work demonstrates that integrin-mediated cell adhesion and soluble integrin ligands contribute to maintaining COX-2 steady-state levels in endothelial cells by the combined prevention of lysosomal-dependent degradation and the stimulation of mRNA synthesis involving multiple signaling pathways.
Abstract:
This master's thesis deals with tools designed for cost estimation and price setting. First, the fundamentals of traditional and activity-based costing are reviewed; the differences between these methods are examined, and the better suitability of activity-based costing for today's companies is justified. Pricing is addressed second: the significance of price, pricing methods and the decision on the final price are covered. After pricing, cost systems and cost estimation are presented. These topics demonstrate that accurate cost estimates are vital for a company. Product cost estimation, price setting and tendering are highly significant matters considering the whole project life cycle and future profits. Nowadays it is common to use tools for cost estimation and sometimes also for pricing. The reliability of the tools must be known before they are taken into use, and their users must also be trained well. Otherwise the company will most likely face unexpected and unpleasant surprises.