932 results for thrombocyte concentrate
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Master's dissertation in Administrative Law
Abstract:
Aromatic amines are widely used industrial chemicals, and their major sources in the environment include several chemical industry sectors such as oil refining, synthetic polymers, dyes, adhesives, rubbers, perfumes, pharmaceuticals, pesticides and explosives. They also result from diesel exhaust, combustion of wood chips and rubber, and tobacco smoke. Some aromatic amines are generated during cooking, especially of grilled meat and fish, as well. The intensive use and production of these compounds explains their occurrence in the environment, in air, water and soil, thereby creating a potential for human exposure. Since aromatic amines are potentially carcinogenic and toxic agents, they constitute an important class of environmental pollutants of enormous concern, whose efficient removal is a crucial task for researchers, and several methods have been investigated and applied. In this chapter the types and general properties of aromatic amine compounds are reviewed. As aromatic amines are continuously entering the environment from various sources and have been designated as high-priority pollutants, their presence in the environment must be monitored at concentration levels lower than 30 mg L-1, compatible with the limits allowed by the regulations. Consequently, the most relevant analytical methods to determine the aromatic amine composition of environmental matrices, and to monitor their degradation, are essential and will be presented. These include spectroscopy, namely UV/visible and Fourier transform infrared spectroscopy (FTIR); chromatography, in particular thin-layer (TLC), high-performance liquid (HPLC) and gas chromatography (GC); capillary electrophoresis (CE); mass spectrometry (MS); and combinations of different methods, including GC-MS, HPLC-MS and CE-MS. Choosing the best method depends on availability, cost, detection limit and sample concentration, since samples sometimes need to be concentrated or pretreated. Combined methods, however, may give more complete results based on the complementary information they provide. The environmental impact, toxicity and carcinogenicity of many aromatic amines have been reported and are also emphasized in this chapter. Finally, conventional aromatic amine degradation processes and the alternative biodegradation processes are highlighted. Parameters affecting biodegradation, the role of different electron acceptors in aerobic and anaerobic biodegradation, and kinetics are discussed. Conventional processes, including extraction, adsorption onto activated carbon, chemical oxidation, advanced oxidation, electrochemical techniques and irradiation, suffer from drawbacks such as high costs, formation of hazardous by-products and low efficiency. Biological processes, which take advantage of processes occurring naturally in the environment, have been developed and tested, and have proved to be an economic, energy-efficient and environmentally feasible alternative. Aerobic biodegradation is one of the most promising techniques for aromatic amine remediation, but it has the drawback that aromatic amines may autooxidize, rather than degrade, once exposed to oxygen. Higher costs, especially due to the power consumption for aeration, can also limit its application. Anaerobic degradation technology is a novel path for the treatment of a wide variety of aromatic amines, including industrial wastewater, and will be discussed.
However, some aromatic amines are difficult to degrade under anaerobic conditions and, thus, other electron acceptors such as nitrate, iron, sulphate, manganese and carbonate have been tested as alternatives.
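Where the chapter discusses biodegradation kinetics, a first-order decay model is a common starting point for fitting removal data; as an illustrative sketch (an assumed model form, not one specified in the abstract):

$$C(t) = C_0\,e^{-k t}, \qquad t_{1/2} = \frac{\ln 2}{k},$$

where $C_0$ is the initial aromatic amine concentration, $k$ the first-order rate constant, and $t_{1/2}$ the corresponding half-life.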
Abstract:
Master's dissertation in Language Sciences
Abstract:
Identification and characterization of the problem. One of the most important problems associated with software construction is its correctness. In the search for guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Owing to their nature, applying formal methods requires considerable experience and knowledge, above all in mathematics and logic, which makes their application costly in practice. As a result, their main application has been limited to critical systems, that is, systems whose malfunction can cause major damage, even though the benefits these techniques provide are relevant to every kind of software. Transferring the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is an element of great importance. Examples of this are several powerful analysis tools based on formal methods, whose application targets source code directly. In the vast majority of these tools, the gap between the notions developers are accustomed to and those needed to apply these formal analysis tools remains too wide. Many tools use assertion languages that fall outside developers' usual knowledge and habits. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the elements to be analysed grows (scalability). This limitation is widely known and is considered critical for the applicability of formal analysis methods in practice. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims at building formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks, on the one hand, to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability above those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools that allow their use by developers familiar with the application context, but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials to be used will be bibliography relevant to the area and computing equipment. Methods. The methods of discrete mathematics, logic and software engineering will be employed. Expected results. One of the expected results of the project is the identification of specific application settings for formal analysis methods.
As a result of the project, analysis tools are expected to emerge whose level of usability is suitable for use by developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness, and tackle this problem basically by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Consequently, their acceptance by software engineers is rather restricted, and formal methods applications have been confined to critical systems. Nevertheless, it is clear that the advantages formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated in widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
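As an illustration of the kind of SMT-based analysis mentioned above, the following is a minimal sketch using the Z3 Python bindings; the solver choice, variable names and verification condition are illustrative assumptions, not artifacts of the project.

# Minimal SMT-solving sketch (assumes the z3-solver package is installed).
# The verification condition below is a hypothetical example, not taken from the project.
from z3 import Int, Solver, And, Implies, Not, sat

x = Int('x')
y = Int('y')

# Hypothetical property: if 0 <= x < y over the integers, then x + 1 <= y.
vc = Implies(And(x >= 0, x < y), x + 1 <= y)

s = Solver()
s.add(Not(vc))  # ask the solver for a counterexample to the property
if s.check() == sat:
    print("counterexample found:", s.model())
else:
    print("no counterexample: the property holds for all integers x, y")

Running the sketch reports no counterexample, since the property is valid over the integers; weakening the conclusion to, say, x + 2 <= y would produce a counterexample such as x = 0, y = 1.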
Abstract:
We analyze the incentives for cooperation of three players differing in their efficiency of effort in a contest game. We concentrate on the non-cooperative bargaining foundation of coalition formation, and therefore, we adopt a two-stage model. In the first stage, individuals form coalitions following a bargaining protocol similar to the one proposed by Gul (1989). Afterwards, coalitions play the contest game of Esteban and Ray (1999) within the resulting coalition structure of the first stage. We find that the grand coalition forms whenever the distribution of the bargaining power in the coalition formation game is equal to the distribution of the relative efficiency of effort. Finally, we use the case of equal bargaining power for all individuals to show that other types of coalition structures may be observed as well.
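As background for the contest stage, a ratio-form contest success function of the general kind used in Esteban and Ray (1999), written here with efficiency weights to reflect the players' differing efficiency of effort (a sketch of the general form, not the paper's exact specification), is

$$p_i = \frac{\theta_i\, e_i}{\sum_{j} \theta_j\, e_j},$$

where $e_i$ is the effort of player (or coalition) $i$, $\theta_i$ its efficiency of effort, and $p_i$ its probability of winning the contest.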
Abstract:
A quantitative method for determining viral pollution in large volumes of water using ferric hydroxide gel impregnated on the surface of a glassfibre cartridge filter. The use of ferric hydroxide gel impregnated on the surface of a glassfibre cartridge filter enabled us to recover 62.5% of virus (poliomyelitis virus type I, LSc strain) exogenously added to 400 liters of tap water. The virus concentrator system consists of four cartridge filters, the first three of which are clarifiers, where contaminants are removed physically without significant virus loss at this stage. The last cartridge filter is impregnated with ferric hydroxide gel, where the virus is adsorbed. After the required volume of water has been processed, the last filter is removed from the system and the viruses are recovered from the gel using 1 liter of glycine/NaOH buffer at pH 11. The eluate is immediately clarified through a series of cellulose acetate membranes mounted in a 142 mm Millipore filter holder. For the second step of virus concentration, 1N HCl is added slowly to the eluate to reach pH 3.5-4. MgCl2 is added to give a final concentration of 0.05 M, and the viruses are readsorbed onto a 0.45 µm porosity (HA) cellulose acetate membrane mounted in a 90 mm Millipore filter holder. The viruses are recovered using the same eluent plus 10% fetal calf serum, to a final volume of 3 ml. In this way, it was possible to concentrate virus from 400 liters of tap water into 1 liter in the first stage of virus concentration and into a final volume of just 3 ml in the second step. The efficiency, simplicity and low operational cost provided by the method make it feasible to study viral pollution of recreational and tap-water sources.
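From the volumes stated in the abstract, the overall two-step concentration factor follows directly (a simple arithmetic restatement, not an additional result):

$$\frac{400\ \text{L}}{1\ \text{L}} = 400\times \ \text{(step 1)}, \qquad \frac{1\ \text{L}}{3\ \text{mL}} \approx 333\times \ \text{(step 2)}, \qquad \frac{400\ \text{L}}{3\ \text{mL}} \approx 1.3\times 10^{5}\times \ \text{(overall)}.$$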
Abstract:
Brain metastases (BM) occur in 20-50% of NSCLC and 50-80% of SCLC. In this review, we look at evidence-based medicine data and give some perspectives on the management of BM. We address the problems of multiple BM, single BM and prophylactic cranial irradiation. Recursive Partitioning Analysis (RPA) is a powerful prognostic tool to facilitate treatment decisions. For multiple BM, the use of corticosteroids was established more than 40 years ago by a single randomized controlled trial (RCT). The palliative effect is high (~80%), as are the side effects. Whole brain radiotherapy (WBRT) was evaluated in many RCTs, with a high (60-90%) response rate; several RT regimens are equivalent, but very high doses per fraction should be avoided. In multiple BM from SCLC, the effect of WBRT is comparable to that in NSCLC, but chemotherapy (CXT), although advocated, is probably less effective than RT. Single BM from NSCLC occur in 30% of all BM cases; several prognostic classifications, including RPA, are very useful. Several options are available in single BM: WBRT, surgery (SX), radiosurgery (RS) or any combination of these. All were studied in RCTs and are reviewed: the addition of WBRT to SX or RS gives better neurological tumour control, has little or no impact on survival, and may be more toxic. However, omitting WBRT after SX alone gives a higher risk of cerebrospinal fluid dissemination. Prophylactic cranial irradiation (PCI) has a major role in SCLC. In limited disease, meta-analyses have shown a positive impact of PCI in decreasing brain relapse and improving survival, especially for patients in complete remission. Surprisingly, this has recently been confirmed also in extensive disease. Experience with PCI for NSCLC is still limited, but RCTs suggest a reduction of BM with no impact on survival. The toxicity of PCI is a matter of debate, as neurological or neuro-cognitive impairment is already present prior to PCI in almost half of patients. However, RT toxicity is probably related to total dose and dose per fraction. Perspectives: future research should concentrate on (1) combined modalities in multiple BM; (2) exploration of treatments in oligo-metastases; (3) further exploration of PCI in NSCLC; and (4) exploration of new, toxicity-sparing radiotherapy techniques (IMRT, tomotherapy, etc.).
Abstract:
Capital taxation is currently under debate, basically due to problems of administrative control and proper assessment of the levied assets. We analyze both problems focusing on one capital tax, the annual wealth tax (WT), which is applied in only five OECD countries, Spain being one of them. We concentrate our analysis on the top 1% of the adult population, which permits us to describe the evolution of wealth concentration in Spain over 1983-2001. On average, the top 1% holds about 18% of total wealth, which rises to 19% when tax non-compliance and under-assessment are corrected for housing, the main asset. The evolution suggests that wealth concentration has risen. Regarding the WT, we analyze whether it helps to reduce wealth inequality or, on the contrary, reinforces vertical inequity (due to special concessions) and horizontal inequity (due to the de iure and de facto different treatment of assets). We analyze housing and equity shares in detail. By means of a time series analysis, we relate the reported values to reasonable price indicators and proxies of the propensity to save. We infer that net tax compliance is extremely low, where net compliance includes both what we commonly understand by (gross) tax compliance and the degree of under-assessment due to fiscal legislation (for housing). That is especially true for housing, whose level of net tax compliance is well below 50%. Hence, we corroborate the difficulties in taxing capital, and thus cast doubts on the current role of the WT in Spain in reducing wealth inequality.
Abstract:
The latent membrane protein 1 (LMP1) encoded by the Epstein-Barr virus acts like a constitutively activated receptor of the tumor necrosis factor receptor (TNFR) family and is enriched in lipid rafts. We showed that LMP1 is targeted to lipid rafts in transfected HEK 293 cells, and that the endogenous TNFR-associated factor 3 (TRAF3) binds LMP1 and is recruited to lipid rafts upon LMP1 expression. An LMP1 mutant lacking the C-terminal 55 amino acids (CΔ55) behaves like the wild-type (WT) LMP1 with respect to membrane localization. In contrast, a mutant with a deletion of the 25 N-terminal residues (NΔ25) does not concentrate in lipid rafts but still binds TRAF3, demonstrating that the cellular localization of LMP1 is not crucial for TRAF3 localization. Moreover, NΔ25 inhibited WT LMP1-mediated induction of the transcription factors NF-κB and AP-1. Morphological data indicate that NΔ25 hampers WT LMP1 plasma membrane localization, thus blocking LMP1 function.
Abstract:
Cypermethrin (4 g/l and 5 g/l wettable powder, and 7 ml/l and 10 ml/l emulsifiable concentrate) was tested, under laboratory conditions, against adult Musca domestica. As a standard for comparison, a 6 ml/l suspension concentrate formulation of deltamethrin was used. One and twenty-four hours after application, mortality counts showed that the substances under test killed, respectively, more than 80% and 85% of the exposed insects. Under the conditions of the test, cypermethrin was considered effective for the control of the house fly.
Abstract:
Study carried out during a stay at the Université de Poitiers, France, between 2007 and 2009. The work focused on two basic activities. On the one hand, the development of a protocol for fractionating soil organic matter by successive extractions with alkaline solvents after a sequence of soil pretreatments: no pretreatment, acid attack (to destroy the carbonates), and dithionite attack (to reduce Fe and Al oxides and facilitate extraction of the organic matter associated with these compounds). The protocol gives an overall view of the state of soil organic matter, combining physical aspects (protection, precipitation, occlusion by carbonates) and chemical aspects (degree of humification). On the other hand, learning the technique of thermochemolysis-gas chromatography-mass spectrometry. This was the objective of the stay in Poitiers, to which we gave priority. We focused on the study of physical (densimetric) fractions obtained in previous studies on forest soils. The fractions considered are: light fraction (FL), three occluded fractions (OC1, OC2 and OC3) and dense fraction (FD). Applying thermochemolysis makes it possible to characterize several groups of substances, among which we focused on some biochemical indicators: fatty acids, alcohols, diacids, phenolic and other aromatic products, and carbohydrate derivatives. The overall study of these products indicates that it is in the occluded fractions (which are usually minor in all horizons) that organic matter of microbial origin dominates, whereas in the light (FL) and dense (FD) fractions organic matter of plant origin seems to dominate. We plan to apply this technique to the study of the fractions obtained in the first part of the work, which are currently frozen and awaiting processing.
Abstract:
Cowpea (Vigna unguiculata) seeds are heavily damaged during storage by the bruchid Callosobruchus maculatus. Seeds of some Nigerian varieties showed strong resistance to this bruchid. Using biochemical and entomological techniques, we were able to rule out the participation of proteolytic enzyme (trypsin, chymotrypsin, subtilisin and papain) inhibitors, lectins, and tannins in the resistance mechanisms. Fractionation of the seed meal of a resistant variety suggests that the factor(s) responsible for the effect is (are) concentrated in the globulin fraction.
Abstract:
Suburbanization is changing urban spatial structure, and less monocentric metropolitan regions are becoming the new urban reality. Most works have focused only on centers, studying these spatial changes while neglecting the role of transport infrastructure and its related location model, the "accessibility city", in which employment and population concentrate in low-density settlements close to transport infrastructure. For the case of Barcelona, we consider this location model and study the spatial structure of population between 1991 and 2006. The results reveal a mix between polycentricity and the accessibility city, with movements away from the main centers but close to the transport infrastructure.
Abstract:
In this paper, we analyse the asymptotic behavior of solutions of the continuous kinetic version of flocking by Cucker and Smale [16], which describes the collective behavior of an ensemble of organisms, animals or devices. This kinetic version, introduced in [24], is here obtained starting from a Boltzmann-type equation. The large-time behavior of the distribution in phase space is subsequently studied by means of particle approximations and a stability property in distances between measures. A continuous analogue of the theorems of [16] is shown to hold for the solutions of the kinetic model. More precisely, the solutions concentrate their velocities exponentially fast around the mean velocity, while in space they converge towards a translational flocking solution.
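For background, the underlying particle model of Cucker and Smale is usually written as follows (a standard formulation given here for reference; the exact normalisation and notation in [16] and [24] may differ):

$$\dot{x}_i = v_i, \qquad \dot{v}_i = \frac{\lambda}{N}\sum_{j=1}^{N}\psi\big(\lVert x_j - x_i\rVert\big)\,(v_j - v_i), \qquad \psi(r) = \frac{1}{(1+r^{2})^{\beta}},$$

where $\lambda > 0$ is a coupling strength and $\beta \ge 0$ controls how fast the communication rate $\psi$ decays with distance; flocking (all velocities converging to the common mean) is known to hold unconditionally when $\psi$ decays slowly enough, roughly $\beta \le 1/2$.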