927 results for: Non-ionic surfactant. Cloud point. Flory-Huggins model. UNIQUAC model. NRTL model
Abstract:
Final Master's project submitted for the degree of Master in Chemical and Biological Engineering
Abstract:
Master's dissertation presented to ISPA - Instituto Universitário
Abstract:
Poly(vinylidene fluoride-co-chlorotrifluoroethylene), PVDF-CTFE, membranes were prepared by solvent casting from dimethylformamide, DMF. The preparation conditions involved a systematic variation of the polymer/solvent ratio and the solvent evaporation temperature. The microstructural variations of the PVDF-CTFE membranes depend on the different regions of the PVDF-CTFE/DMF phase diagram, explained by the Flory-Huggins theory. The effects of the polymer/solvent ratio and solvent evaporation temperature on the morphology, degree of porosity, β-phase content, degree of crystallinity, and mechanical, dielectric and piezoelectric properties of the PVDF-CTFE polymer were evaluated. In this binary system, the porous microstructure is attributed to spinodal decomposition during liquid-liquid phase separation. For a polymer/solvent ratio of 20 wt% and the higher solvent evaporation temperature, the β-phase content is around 82% and the piezoelectric coefficient, d33, is -4 pC/N.
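The Flory-Huggins description invoked above can be made explicit. Per lattice site, with polymer volume fraction φ, degree of polymerization N, and interaction parameter χ, the standard textbook form (not taken from this abstract) is:

```latex
\frac{\Delta G_{\mathrm{mix}}}{k_{B}T}
  = \frac{\phi}{N}\ln\phi + (1-\phi)\ln(1-\phi) + \chi\,\phi(1-\phi)
```

The spinodal region, inside which liquid-liquid phase separation proceeds by spinodal decomposition, is bounded by the condition

```latex
\frac{\partial^{2}}{\partial\phi^{2}}\!\left(\frac{\Delta G_{\mathrm{mix}}}{k_{B}T}\right)
  = \frac{1}{N\phi} + \frac{1}{1-\phi} - 2\chi = 0 .
```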
Abstract:
OBJECTIVE: To investigate the role of hemodynamic changes occurring during acute MI in subsequent fibrosis deposition within non-MI myocardium. METHODS: Using the rat model of MI, 3 groups of 7 rats each [sham, SMI (MI <30%), and LMI (MI >30%)] were compared. Systemic and left ventricular (LV) hemodynamics were recorded 10 minutes before and after coronary artery ligature. Collagen volume fraction (CVF) was calculated in picrosirius red-stained heart tissue sections 4 weeks later. RESULTS: Before surgery, all hemodynamic variables were comparable among groups. After surgery, LV end-diastolic pressure increased and coronary driving pressure decreased significantly in the LMI group compared with the sham group. LV dP/dt max and dP/dt min of both the SMI and LMI groups were statistically different from those of the sham group. CVF within the non-MI interventricular septum and right ventricle did not differ between either MI group and the sham group. In contrast, subendocardial (SE) CVF was statistically greater in the LMI group. SE CVF correlated negatively with post-MI systemic blood pressure and coronary driving pressure, and positively with post-MI LV dP/dt min. Stepwise regression analysis identified post-MI coronary driving pressure as an independent predictor of SE CVF. CONCLUSION: LV remodeling in rats with MI is characterized by predominant SE collagen deposition in non-MI myocardium and results from a reduction in myocardial perfusion pressure occurring early in the setting of MI.
Abstract:
Abstract Background: Cardiac resynchronization therapy (CRT) is a treatment recommended by the leading global guidelines. However, 30%-40% of selected patients are non-responders. Objective: To develop an echocardiographic model to predict cardiac death or transplantation (Tx) 1 year after CRT. Method: Observational, prospective study including 116 patients, aged 64.89 ± 11.18 years, 69.8% male, 68.1% in NYHA FC III and 31.9% in FC IV, 71.55% with left bundle-branch block, and a median ejection fraction (EF) of 29%. Evaluations were made in the pre-implantation period and 6-12 months afterwards, and correlated with cardiac mortality/Tx at the end of follow-up. Cox and logistic regression analyses were performed with ROC and Kaplan-Meier curves. The model was internally validated by bootstrapping. Results: There were 29 (25%) deaths/Tx during a follow-up of 34.09 ± 17.9 months. Cardiac mortality/Tx was 16.3%. In the multivariate Cox model, EF < 30%, grade III/IV diastolic dysfunction and grade III mitral regurgitation at 6-12 months were independently related to increased cardiac mortality or Tx, with hazard ratios of 3.1, 4.63 and 7.11, respectively. The area under the ROC curve was 0.78. Conclusion: EF lower than 30%, severe diastolic dysfunction and severe mitral regurgitation indicate poor prognosis 1 year after CRT. The combination of two of those variables indicates the need for other treatment options.
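The Kaplan-Meier curves mentioned above rest on the product-limit estimator, which can be sketched in a few lines; the follow-up data below are invented for illustration and are not from the study:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of survival.
    times: follow-up in months; events: 1 = cardiac death/Tx, 0 = censored."""
    # Sort subjects by follow-up time.
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []  # (event time, S(t)) after each event time
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n = at_risk  # number at risk just before time t
        # Group all subjects tied at the same time point.
        while i < len(order) and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            at_risk -= 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n  # multiply by conditional survival
            curve.append((t, surv))
    return curve

# Hypothetical follow-up data (months) -- illustrative only.
t = [6, 12, 12, 18, 24, 30, 34, 34]
e = [1, 1, 0, 1, 0, 1, 0, 0]
km = kaplan_meier(t, e)
```

Censored subjects lower the number at risk without triggering a step, which is exactly how the curve separates incomplete follow-up from observed events.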
Abstract:
One of the problems associated with the remediation of hydrophobic contaminants is their low availability. A contaminant is considered available when it remains in the liquid phase of the medium, either solubilized or in the form of an emulsion. Surfactants are amphiphilic substances that promote the transfer of hydrophobic compounds from the solid to the liquid phase. In this study, pyrene was chosen as a representative polycyclic aromatic hydrocarbon, together with three non-ionic surfactants: one widely cited in the scientific literature (Tween 80) and two commercial products (Gold Crew, BS-400). The study was carried out with three clay/sand mixtures in different proportions. The critical micelle concentration (CMC) is reached earlier in soils with low clay content. Surfactant efficiency is closely related to the clay/sand ratio. At concentrations well above the CMC, no relationship between efficiency and clay content is observed. Tween 80 gave better results than Gold Crew and BS-400, the latter showing no pyrene desorption.
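The role the CMC plays above can be sketched with the standard linear solubilization model: below the CMC only monomers are present and the apparent solubility stays at the aqueous value; above it, solubility grows linearly with the micellized surfactant. All numbers below are illustrative assumptions, not measurements from this study:

```python
def apparent_solubility(c_surf, cmc, s_w, msr):
    """Apparent solute solubility vs. total surfactant concentration (mol/L).
    s_w: aqueous solubility; msr: molar solubilization ratio
    (mol solute solubilized per mol of micellized surfactant)."""
    if c_surf <= cmc:
        return s_w  # only monomers present: no micellar solubilization
    return s_w + msr * (c_surf - cmc)

# Illustrative, assumed order-of-magnitude values for a pyrene/non-ionic system:
CMC = 1.2e-5   # mol/L
S_W = 6.5e-7   # aqueous pyrene solubility, mol/L
MSR = 0.01     # mol pyrene per mol micellized surfactant

s_below = apparent_solubility(5e-6, CMC, S_W, MSR)   # below the CMC
s_above = apparent_solubility(1e-3, CMC, S_W, MSR)   # well above the CMC
```

In soil systems the effective CMC shifts upward because part of the surfactant sorbs to clay, which is consistent with the clay-content dependence reported above.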
Abstract:
This project describes the integration of the day-to-day monitoring needs of the ATLAS experiment from the cloud point of view. The main idea is to develop a set of collectors that gather information on data distribution and processing and on the WLCG tests (Service Availability Monitoring), storing it in dedicated databases so that the results can be displayed on a single HLM (High Level Monitoring) page. Once this is achieved, the application must allow deeper investigation through interaction with the front-end, which will be fed by the statistics stored in the database.
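A minimal sketch of the collector-plus-summary idea described above, using an in-memory SQLite database as a stand-in for the dedicated databases; the table, site and metric names are invented for illustration:

```python
import sqlite3

def store_samples(conn, samples):
    """Insert (site, metric, value) samples gathered by a collector."""
    conn.executemany(
        "INSERT INTO monitoring (site, metric, value) VALUES (?, ?, ?)", samples)
    conn.commit()

def hlm_summary(conn):
    """Aggregate per-site averages, as a single High Level Monitoring view would."""
    cur = conn.execute(
        "SELECT site, AVG(value) FROM monitoring GROUP BY site ORDER BY site")
    return dict(cur.fetchall())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE monitoring (site TEXT, metric TEXT, value REAL)")

# Hypothetical samples from a data-processing collector and a SAM-test collector.
store_samples(conn, [
    ("SITE-A", "jobs_done", 120.0),
    ("SITE-A", "sam_ok",      1.0),
    ("SITE-B", "jobs_done",  80.0),
])
summary = hlm_summary(conn)
```

The front-end would query an aggregation like `hlm_summary` for the single-page view, then drill down into the raw `monitoring` rows on user interaction.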
Abstract:
Järvholm and co-workers (2009) proposed a conceptual model for research on working life. Models are powerful communication and decision tools. This model, however, is strictly unidirectional and does not capture the interactions discussed in the authors' own arguments. With the help of a genealogy of work and of health, it is shown that work and health are interactive and have to be analysed against the background of society.

Key words: research model, work, health, occupational health, society, interaction, discussion paper

Introduction: After an interesting introduction on the objectives of research on working life, Järvholm and co-workers (2009) derive a conceptual model for working life research from a small survey of occupational safety and health (OSH) definitions. The strong point of their model is the entity 'working life', which includes personal development as well as career paths and aging. Yet the model Järvholm et al. (2009) propose is strangely unidirectional: the arrows point from the population to working life, from there to health and to disease, as well as to productivity and economic resources. The diagram shows only one feedback loop, between economic resources and health. We all know that having a chronic disease condition influences work and working capacity. Economic resources, too, have a strong influence on work.
Personal economic resources influence the kind of work someone accepts and facilitate access to continuing professional education. A third observation is that society is absent from the model, although this is less true of the arguments. In fact, there is an incomprehensible gap between the arguments brought forward by Järvholm and co-workers and their reductionist model.

Switzerland has very low coverage of occupational health specialists and is a long way from fulfilling the WHO's recommendations on workers' access to OSH services as described in its Global Plan of Action. The Institute for Work and Health (IST) in Lausanne is the only organisation covering the major domains of OSH research: occupational medicine, occupational hygiene, ergonomics and psychosocial research. As the country's sole occupational health institution, we are forced to reflect on the objectives of working life research so as not to waste the scarce resources available.

Below I set out a much shortened genealogy of work and of health, with the aim of extending Järvholm et al.'s (2009) analysis of the perspectives of working life research in two directions: first, towards the interactive nature of work and health and the integration of society; second, towards the question of what working life means, or where working life could be situated. Work as we know it today - paid work regulated by a contract, as the basis for sustaining life and as a foundation for social rights - was born in the modern era. I will therefore start my genealogy in the pre-modern era, focus on the important changes that occurred during the industrial revolution and the modern era, and end in 2010, taking into account the enormous transformations of the past 20-30 years. I will put aside some 810 years of advances in science and technology that have expanded the world's limits and human understanding, and restrict my genealogy to work and to health/the body, implicating also the societal realm.
Abstract:
Purpose: We evaluated the potential of hybrid PET/MRI devices to provide integrated metabolic, functional and anatomic characterisation of patients with suspected coronary artery disease. Methods and Materials: Ten subjects (5 patients with suspected hibernating myocardium and 5 healthy volunteers) underwent an imaging study using a hybrid PET/MRI (Philips). Viability assessment by 18F-FDG was performed in the patients along with the MRI anatomic and functional study, and reassessed within 30 minutes by conventional PET/CT. Non-contrast right coronary artery (RCA)-targeted and whole-heart 3D coronary angio-MRI using ECG gating and a respiratory navigator was performed in the healthy volunteers, with reconstruction using MPR and volume rendering. The extent of the metabolic defect (MD) on PET/MRI and PET/CT was compared per patient and per coronary territory (LAD, CX, RCA). Assessability of the coronary lumen was judged as good, sub-optimal or non-assessable using a 16-segment coronary model. Results: Metabolic assessment was successful in all patients, with an MD of 19.2% vs 18.3% using PET/MRI and PET/CT, respectively (P = ns). The MD was 10.2%, 6% and 3% vs 9.3%, 6% and 3% for the LAD, CX and RCA territories, respectively (P = ns). Coronary angio-MRI was successful in all volunteers, with 66 coronary segments visualised overall. The RCA was fully visualised in 4/5 volunteers and the left coronary arteries in 4/5 volunteers. Assessability of visualised segments was good, sub-optimal and non-assessable in 88%, 2% and 10%, respectively. Conclusion: Hybrid PET/MRI devices may enable metabolic evaluation comparable to PET/CT, with additional value owing to accurate functional and anatomical information including coronary assessment.
Abstract:
Recently, there has been increased interest in the neural mechanisms underlying perceptual decision making. However, the effect of neuronal adaptation in this context has not yet been studied. We begin our study by investigating how adaptation can bias perceptual decisions. We considered behavioral data from an experiment on high-level adaptation-related aftereffects in a perceptual decision task with ambiguous stimuli in humans. To understand the driving force behind the perceptual decision process, a biologically inspired cortical network model was used. Two theoretical scenarios arose for explaining the perceptual switch from the category of the adaptor stimulus to the opposite, nonadapted one: noise-driven transitions due to the probabilistic spike times of neurons, and adaptation-driven transitions due to afterhyperpolarization currents. With increasing levels of neural adaptation, the system shifts from a noise-driven to an adaptation-driven regime. The behavioral results show that the underlying model is not just a bistable model, as is usual in the decision-making modeling literature, but that neuronal adaptation is high and therefore the working point of the model is in the oscillatory regime. Using the same model parameters, we studied the effect of neural adaptation in a perceptual decision-making task where the same ambiguous stimulus was presented with and without a preceding adaptor stimulus. We find that for different levels of sensory evidence favoring one of the two interpretations of the ambiguous stimulus, higher levels of neural adaptation lead to quicker decisions, contributing to a speed-accuracy trade-off.
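The adaptation-driven oscillatory regime described above can be illustrated with a minimal deterministic rate model: two mutually inhibiting populations, each with a slow adaptation current. The parameters and gain function below are illustrative choices, not those of the paper's spiking network; the point is only that with strong enough adaptation, dominance alternates with no noise at all:

```python
def simulate(T=5.0, dt=0.001, tau=0.01, tau_a=0.5,
             drive=1.2, beta=2.0, g=2.0):
    """Two mutually inhibiting rate populations; each accumulates an
    adaptation current that eventually hands dominance to the competitor."""
    clip = lambda x: max(0.0, min(1.0, x))   # piecewise-linear gain
    r1, r2, a1, a2 = 1.0, 0.0, 0.0, 0.0     # population 1 starts dominant
    trace = []
    for _ in range(int(T / dt)):
        in1 = drive - beta * r2 - g * a1    # drive minus inhibition/adaptation
        in2 = drive - beta * r1 - g * a2
        r1 += dt / tau * (-r1 + clip(in1))
        r2 += dt / tau * (-r2 + clip(in2))
        a1 += dt / tau_a * (-a1 + r1)       # adaptation slowly tracks each rate
        a2 += dt / tau_a * (-a2 + r2)
        trace.append((r1, r2))
    return trace

trace = simulate()
```

Because the adaptation variables grow while their population dominates, the effective drive to the winner decays until the suppressed population takes over, producing the switch from the adapted category to the nonadapted one without stochastic spiking.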
Abstract:
Aims. We revisit the vicinity of the microquasar Cygnus X-3 at radio wavelengths. We aim to improve our previous search for possible associated extended radio features/hot spots along the position angle of the Cygnus X-3 relativistic jets, focusing on shorter angular scales than previously explored. Methods. Our work is mostly based on analyzing modern survey and archive radio data, mainly including observations carried out with the Very Large Array and the Ryle Telescope. We also used deep near-infrared images that we obtained in 2005. Results. We present new radio maps of the Cygnus X-3 field computed by combining multi-configuration Very Large Array archive data at 6 cm with different observing runs at 2 cm from the Ryle Telescope. These are probably among the deepest radio images of Cygnus X-3 reported to date at cm wavelengths. Both interferometers reveal an extended radio feature within a few arc-minutes of the microquasar position, thus making our detection more credible. Moreover, this extended emission is possibly non-thermal, although this point still needs confirmation. Its physical connection with the microquasar is tentatively considered under different physical scenarios. We also report the serendipitous discovery of a likely Fanaroff-Riley type II radio galaxy only away from Cygnus X-3.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike the CISC world, the RISC processor architecture business is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base, owing to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry's actors - customers, incumbents and newcomers - on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
OBJECTIVES: Specifically, we aim to demonstrate that the results of our earlier safety data hold true in this much larger multi-national and multi-ethnic population. BACKGROUND: We sought to re-evaluate the frequency, manifestations, and severity of acute adverse reactions associated with the administration of several gadolinium-based contrast agents during routine CMR at a European level. METHODS: Multi-centre, multi-national, and multi-ethnic registry with consecutive enrolment of patients in 57 European centres. RESULTS: During the current observation, 37,788 doses of gadolinium-based contrast agent were administered to 37,788 patients. The mean dose was 24.7 ml (range 5-80 ml), which is equivalent to 0.123 mmol/kg (range 0.01-0.3 mmol/kg). Forty-five acute adverse reactions due to contrast administration occurred (0.12%). Most reactions were classified as mild (43 of 45) according to the American College of Radiology definition. The most frequent complaints following contrast administration were rashes and hives (15 of 45), followed by nausea (10 of 45) and flushes (10 of 45). The event rate ranged from 0.05% (linear non-ionic agent gadodiamide) to 0.42% (linear ionic agent gadobenate dimeglumine). Interestingly, we also found different event rates among the three main indications for CMR, ranging from 0.05% (risk stratification in suspected CAD) to 0.22% (viability in known CAD). CONCLUSIONS: The current data indicate that the results of the earlier safety data hold true in this much larger multi-national and multi-ethnic population. Thus, the "off-label" use of gadolinium-based contrast agents in cardiovascular MR should be regarded as safe concerning the frequency, manifestation and severity of acute events.
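The overall reaction frequency reported above is simply events per administered dose expressed as a percentage; as a quick check using the registry's own figures:

```python
def event_rate(events, doses):
    """Acute adverse reaction rate as a percentage of administered doses."""
    return 100.0 * events / doses

# 45 reactions among 37,788 doses, as reported in the abstract.
overall = event_rate(45, 37788)
```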
Abstract:
This review deals with the general use of surfactants in analytical chemistry. The principal characteristic of micelles exploited analytically is the improvement in the selectivity and/or sensitivity of the determination, with emphasis on catalytic reactions and "cloud point" extraction.
Abstract:
The aim of this work was to explore the application of a non-ionic resin, obtained by impregnating Amberlite XAD-7 with Alizarin Red S (VAS), for the separation and preconcentration of manganese, copper and zinc in saline matrices. In this system, the metals were quantitatively retained in the pH range 8.5-10.0 using 0.50 g of solid phase, a stirring time of five minutes and a total mass of up to 200 µg of each cation. The sorbed elements were subsequently eluted, and preconcentration factors of fifty-fold for Zn and ten-fold each for Cu and Mn were obtained.
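The preconcentration factors quoted above follow from the ratio of processed sample volume to final eluate volume, assuming quantitative retention and elution; the volumes below are hypothetical, chosen only to reproduce a fifty-fold and a ten-fold factor:

```python
def preconcentration_factor(v_sample_ml, v_eluate_ml):
    """Enrichment factor for solid-phase extraction: V_sample / V_eluate
    (valid only when retention and elution are quantitative)."""
    return v_sample_ml / v_eluate_ml

# e.g. 500 mL of saline sample eluted into 10 mL -> fifty-fold (as for Zn);
# 100 mL into 10 mL -> ten-fold (as for Cu or Mn). Volumes are illustrative.
pf_zn = preconcentration_factor(500.0, 10.0)
pf_cu = preconcentration_factor(100.0, 10.0)
```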