931 results for Barrier-free design.
Abstract:
Fluoroquinolones are antibiotics with a broad spectrum of action, especially against Gram-negative bacteria. Their mechanism of action rests on the inhibition of enzymes responsible for DNA replication. However, owing to their misuse, the emergence of bacterial resistance to these antibiotics has become a serious public health problem. Since their targets are intracellular, reduced permeability of the outer membrane of Gram-negative bacteria is one of the best-known resistance mechanisms. This reduction is associated with low expression of, or mutations in, the porins required for their transport, most notably OmpF. Previous studies have shown that fluoroquinolones coordinated with divalent metal ions and 1,10-phenanthroline (generically termed metalloantibiotics) are potential candidates as alternatives to conventional fluoroquinolones. These metalloantibiotics exhibit an antimicrobial effect comparable or superior to that of the free fluoroquinolone, but appear to follow a different, porin-independent translocation route. These differences in the uptake mechanism may be key to circumventing bacterial resistance. To understand the role of lipids in the entry mechanism of metalloantibiotics, the interaction and localization of the metalloantibiotics of Ciprofloxacin (2nd generation), Levofloxacin (3rd generation), and Moxifloxacin (4th generation) were studied in a porin-free Escherichia coli membrane model. These studies were carried out by fluorescence spectroscopy, with both steady-state and time-resolved measurements. The partition coefficients determined showed a stronger interaction of the metalloantibiotics relative to the corresponding free fluoroquinolones, a fact directly related to the species present in solution at physiological pH.
The localization studies showed that these metalloantibiotics must be inserted into the bacterial membrane, confirming their porin-independent entry. This entry mechanism, via the hydrophobic route, is promoted by electrostatic interactions between the cationic metalloantibiotic species that exist at pH 7.4 and the negatively charged groups of the membrane phospholipids. The results of this study therefore suggest that the entry routes of the metalloantibiotics and of the corresponding fluoroquinolones must differ. Metalloantibiotics are suitable candidates for further laboratory testing and a promising alternative to conventional fluoroquinolones, since they appear to overcome one of the main mechanisms of bacterial resistance to this class of antibiotics.
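Partition coefficients of this kind are commonly extracted by fitting steady-state fluorescence intensities, measured at increasing lipid concentrations, to the standard hyperbolic partition model. Below is a minimal, dependency-free sketch in Python; the model form is the usual one for membrane partition studies, but the lipid molar volume, intensities, and Kp value are assumed for illustration, not taken from this work:

```python
# Sketch: estimating a membrane/water partition coefficient (Kp) from
# steady-state fluorescence intensities (all numbers are illustrative).

def partition_model(L, I_w, I_m, Kp, gamma=0.75e-3):
    """Fluorescence intensity at lipid concentration L (mol/L).
    I_w, I_m: intensities in water and in the membrane phase.
    gamma: lipid molar volume in L/mol (assumed value)."""
    x = Kp * gamma * L
    return (I_w + I_m * x) / (1.0 + x)

def fit_Kp(lipid, intensity, I_w, I_m, grid):
    """Coarse grid search for the Kp that best matches the data."""
    best_Kp, best_sse = None, float("inf")
    for Kp in grid:
        sse = sum((partition_model(L, I_w, I_m, Kp) - I) ** 2
                  for L, I in zip(lipid, intensity))
        if sse < best_sse:
            best_Kp, best_sse = Kp, sse
    return best_Kp

# Synthetic data generated with Kp = 1500 to check that the fit recovers it.
lipid = [0.0, 0.2e-3, 0.5e-3, 1.0e-3, 2.0e-3, 5.0e-3]
data = [partition_model(L, 100.0, 400.0, 1500.0) for L in lipid]
Kp_hat = fit_Kp(lipid, data, 100.0, 400.0, range(100, 5001, 10))
```

In practice a nonlinear least-squares routine (e.g. scipy.optimize.curve_fit) would replace the grid search; the grid merely keeps the sketch self-contained.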
Abstract:
This dissertation presents the design of three high-performance successive-approximation-register (SAR) analog-to-digital converters (ADCs) using distinct digital background calibration techniques under the framework of a generalized code-domain linear equalizer. These digital calibration techniques effectively and efficiently remove the static mismatch errors in the analog-to-digital (A/D) conversion. They enable aggressive scaling of the capacitive digital-to-analog converter (DAC), which also serves as the sampling capacitor, to the kT/C limit. As a result, outstanding conversion linearity, high signal-to-noise ratio (SNR), high conversion speed, robustness, superb energy efficiency, and minimal chip area are achieved simultaneously. The first design is a 12-bit 22.5/45-MS/s SAR ADC in a 0.13-μm CMOS process. It employs a perturbation-based calibration, based on the superposition property of linear systems, to digitally correct the capacitor mismatch error in the weighted DAC. With 3.0-mW power dissipation at a 1.2-V power supply and a 22.5-MS/s sample rate, it achieves a 71.1-dB signal-to-noise-plus-distortion ratio (SNDR) and a 94.6-dB spurious-free dynamic range (SFDR). At the Nyquist frequency, the conversion figure of merit (FoM) is 50.8 fJ/conversion-step, the best FoM reported to date (2010) for 12-bit ADCs. The SAR ADC core occupies 0.06 mm², while the estimated area of the calibration circuits is 0.03 mm². The second proposed digital calibration technique is a bit-wise-correlation-based digital calibration. It utilizes the statistical independence of an injected pseudo-random signal and the input signal to correct the DAC mismatch in SAR ADCs. This idea was experimentally verified in a 12-bit 37-MS/s SAR ADC fabricated in 65-nm CMOS, implemented by Pingli Huang.
This prototype chip achieves a 70.23-dB peak SNDR and an 81.02-dB peak SFDR, while occupying 0.12 mm² of silicon area and dissipating 9.14 mW from a 1.2-V supply with the synthesized digital calibration circuits included. The third work is an 8-bit, 600-MS/s, 10-way time-interleaved SAR ADC array fabricated in a 0.13-μm CMOS process. This work employs an adaptive digital equalization approach to calibrate both intra-channel nonlinearities and inter-channel mismatch errors. The prototype chip achieves a 47.4-dB SNDR, a 63.6-dB SFDR, less than 0.30-LSB differential nonlinearity (DNL), and less than 0.23-LSB integral nonlinearity (INL). The ADC array occupies an active area of 1.35 mm² and dissipates 30.3 mW, including the synthesized digital calibration circuits and an on-chip dual-loop delay-locked loop (DLL) for clock generation and synchronization.
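The quoted figure of merit follows the conventional Walden definition, FoM = P / (2^ENOB · f_s), with ENOB = (SNDR − 1.76)/6.02. A quick sanity check in Python with the peak-SNDR numbers from the first design; note that the published 50.8 fJ/conversion-step is computed from the slightly lower SNDR at a Nyquist-frequency input, so the value below comes out a little smaller:

```python
def enob(sndr_db):
    """Effective number of bits implied by an SNDR in dB."""
    return (sndr_db - 1.76) / 6.02

def walden_fom(power_w, sndr_db, fs_hz):
    """Walden figure of merit, in joules per conversion-step."""
    return power_w / (2 ** enob(sndr_db) * fs_hz)

# First design: 3.0 mW, 71.1-dB peak SNDR, 22.5 MS/s.
fom = walden_fom(3.0e-3, 71.1, 22.5e6)   # ~4.5e-14 J, i.e. ~45 fJ/step
```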
Abstract:
Chapter 1: Under the average common value function, we select, almost uniquely, the mechanism that gives the seller the largest portion of the true value in the worst situation, among all direct mechanisms that are feasible, ex-post implementable, and individually rational. Chapter 2: Strategy-proof, budget-balanced, anonymous, envy-free linear mechanisms assign p identical objects to n agents. The efficiency loss is the largest ratio of surplus loss to efficient surplus, over all profiles of non-negative valuations. The smallest efficiency loss is uniquely achieved by the following simple allocation rule: assign one object to each of the p−1 agents with the highest valuations, a large probability to the agent with the pth-highest valuation, and the remaining probability to the agent with the (p+1)th-highest valuation. When "envy-freeness" is replaced by the weaker condition of "voluntary participation", the optimal mechanism differs only when p is much less than n. Chapter 3: One group is to be selected among a set of agents. Agents have preferences over the size of the group if they are selected, and preferences over size, as well as over the "stand-outside" option, are single-peaked. We take a mechanism design approach and search for group selection mechanisms that are efficient, strategy-proof, and individually rational. Two classes of such mechanisms are presented. The proposing mechanism allows agents either to maintain or to shrink the group size following a fixed priority, and is characterized by group strategy-proofness. The voting mechanism enlarges the group size in each voting round, and achieves at least half of the maximum group size compatible with individual rationality.
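The Chapter 2 allocation rule is easy to state programmatically. A sketch in Python, where the "large probability" q assigned to the pth-highest bidder is left as a parameter (its optimal value is derived in the chapter and is not reproduced here); ties are broken by index order for simplicity:

```python
def allocate(valuations, p, q):
    """Allocation probabilities under the rule described above.

    Top p-1 agents receive an object for sure; the pth-highest
    receives probability q; the (p+1)th-highest receives 1 - q.
    q is a parameter here, not the chapter's derived optimum."""
    order = sorted(range(len(valuations)),
                   key=lambda i: valuations[i], reverse=True)
    probs = [0.0] * len(valuations)
    for rank, i in enumerate(order):
        if rank < p - 1:
            probs[i] = 1.0          # among the p-1 highest valuations
        elif rank == p - 1:
            probs[i] = q            # pth-highest valuation
        elif rank == p:
            probs[i] = 1.0 - q      # (p+1)th-highest valuation
    return probs

probs = allocate([5, 3, 9, 1], 2, 0.8)
```

With valuations [5, 3, 9, 1], p = 2 and q = 0.8, the highest-valuation agent gets an object for sure and the second object is split 0.8/0.2 between the next two agents, so all p objects are allocated in expectation.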
Abstract:
Local anesthetic agents cause temporary blockade of nerve impulses, producing insensitivity to painful stimuli in the area supplied by the affected nerve. Bupivacaine (BVC) is an amide-type local anesthetic widely used in surgery and obstetrics for sustained peripheral and central nerve blockade. In this study, we prepared and characterized nanosphere formulations containing BVC. To this end, BVC-loaded poly(DL-lactide-co-glycolide) (PLGA) nanospheres (NS) were prepared by nanoprecipitation and characterized with regard to size distribution, drug loading, and cytotoxicity. A 2^(3−1) fractional factorial experimental design was used to study the influence of three different independent variables on nanoparticle drug loading. BVC was assayed by HPLC, and the particle size and zeta potential were determined by dynamic light scattering. BVC association was determined using a combined ultrafiltration-centrifugation technique. The optimized formulations showed a narrow size distribution with a polydispersity of 0.05, an average diameter of 236.7 ± 2.6 nm, and a zeta potential of −2.93 ± 1.10 mV. In toxicity studies with 3T3 fibroblast cells, BVC-loaded PLGA-NS increased cell viability in comparison with the effect produced by free BVC; in this way, BVC-loaded PLGA-NS decreased BVC toxicity. The development of BVC formulations in carriers such as nanospheres could offer the possibility of controlling drug delivery in biological systems, prolonging the anesthetic effect and reducing toxicity.
Abstract:
Optimization of Carnobacterium divergens V41 growth and bacteriocin activity in a culture medium deprived of animal protein, needed for food bioprotection, was performed using a statistical approach. In a screening experiment, twelve factors (pH, temperature, carbohydrates, NaCl, yeast extract, soy peptone, sodium acetate, ammonium citrate, magnesium sulphate, manganese sulphate, ascorbic acid, and thiamine) were tested for their influence on maximal growth and bacteriocin activity using a two-level incomplete factorial design with 192 experiments performed in microtiter plate wells. Based on the results, a basic medium was developed and three variables (pH, temperature, and carbohydrate concentration) were selected for a scale-up study in a bioreactor. A 2^3 complete factorial design was performed, allowing the estimation of the linear effects of the factors and all first-order interactions. The best conditions for cell production were obtained with a temperature of 15°C and a carbohydrate concentration of 20 g/l, regardless of the pH (in the range 6.5-8), and the best conditions for bacteriocin activity were obtained at 15°C and pH 6.5, regardless of the carbohydrate concentration (in the range 2-20 g/l). The predicted final count of C. divergens V41 and the bacteriocin activity under the optimized conditions (15°C, pH 6.5, 20 g/l carbohydrates) were 2.4 × 10^10 CFU/ml and 819,200 AU/ml, respectively. C. divergens V41 cells cultivated under the optimized conditions were able to grow in cold-smoked salmon and totally inhibited the growth of Listeria monocytogenes (< 50 CFU/g) during five weeks of vacuum storage at 4°C and 8°C.
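A 2^3 complete (full) factorial design like the one above simply enumerates every combination of the low and high levels of the three factors. A short Python sketch; the pH and carbohydrate levels match the ranges quoted in the abstract, while the two temperature levels are assumed for illustration:

```python
from itertools import product

# Factor levels: pH and carbohydrate ranges are from the abstract;
# the two temperature levels are assumed for illustration.
levels = {
    "temperature_C": (15, 30),          # assumed low/high levels
    "pH": (6.5, 8.0),
    "carbohydrates_g_per_l": (2, 20),
}

# A 2^3 full factorial design: every combination of low/high levels,
# which lets all linear effects and first-order interactions be estimated.
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
```

Each of the 8 runs then yields a growth and a bacteriocin-activity response, from which the linear effects and first-order interactions are estimated.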
Abstract:
Aims: Hyperglycaemia (HG) in stroke patients is associated with worse neurological outcome, compromising endothelial cell function and the integrity of the blood-brain barrier (BBB). We studied the contribution of HG-mediated generation of oxidative stress to these pathologies and examined whether antioxidants, as well as normalization of glucose levels following hyperglycaemic insult, reverse these phenomena. Methods: Human brain microvascular endothelial cell (HBMEC) and human astrocyte co-cultures were used to simulate the human BBB. The integrity of the BBB was measured by transendothelial electrical resistance using STX electrodes and an EVOM resistance meter, while enzyme activities were measured by specific spectrophotometric assays. Results: After 5 days of hyperglycaemic insult, there was a significant increase in BBB permeability that was reversed by glucose normalization. Co-treatment of cells with HG and a number of antioxidants, including vitamin C, free radical scavengers, and antioxidant enzymes such as catalase and superoxide dismutase mimetics, attenuated the detrimental effects of HG. Inhibition of p38 mitogen-activated protein kinase (p38MAPK) and protein kinase C, but not phosphoinositide 3-kinase (PI3 kinase), also reversed HG-induced BBB hyperpermeability. In HBMEC, HG enhanced pro-oxidant (NAD(P)H oxidase) enzyme activity and expression, which were normalized by reverting to normoglycaemia. Conclusions: HG impairs brain microvascular endothelial function through the involvement of oxidative stress and several signal transduction pathways.
Abstract:
In this thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The work involved the development and implementation of numerical algorithms, data structures, and software. The numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated, and the convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, since both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows for exact satisfaction of boundary conditions, making it possible to capture exactly the effects of the fluid field on the solid structure.
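To make the fluid side of the coupling concrete: the lattice Boltzmann method advances particle populations through alternating collision and streaming steps. Below is a minimal, dependency-free sketch of the standard D2Q9 BGK collision for a single cell; a real fluid-structure solver applies this over a grid and exchanges boundary data with the structural solver, and none of the thesis's actual code or parameters are reproduced here:

```python
# D2Q9 lattice: 9 discrete velocities and their quadrature weights.
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4/9] + [1/9] * 4 + [1/36] * 4

def equilibrium(rho, ux, uy):
    """Second-order equilibrium populations (lattice units)."""
    feq = []
    for (cx, cy), w in zip(C, W):
        cu = 3 * (cx * ux + cy * uy)
        feq.append(w * rho * (1 + cu + 0.5 * cu * cu
                              - 1.5 * (ux * ux + uy * uy)))
    return feq

def bgk_collide(f, tau):
    """Relax populations toward local equilibrium (relaxation time tau)."""
    rho = sum(f)
    ux = sum(cx * fi for (cx, _), fi in zip(C, f)) / rho
    uy = sum(cy * fi for (_, cy), fi in zip(C, f)) / rho
    feq = equilibrium(rho, ux, uy)
    return [fi - (fi - fe) / tau for fi, fe in zip(f, feq)]

# A cell already at equilibrium is a fixed point of the collision step.
f0 = equilibrium(1.0, 0.05, 0.0)
f1 = bgk_collide(f0, tau=0.8)
```

The fixed-point property checked in the last two lines (equilibrium populations are unchanged by collision) is a handy unit test for any LBM implementation.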
Abstract:
There is growing recognition of the importance of the commensal intestinal microbiota in the development and later function of the central nervous system. Research using germ-free mice (mice raised without any exposure to microorganisms) has provided some of the most persuasive evidence for a role of these bacteria in gut-brain signalling. Key findings show that the microbiota is necessary for normal stress responsivity, anxiety-like behaviors, sociability, and cognition. Furthermore, the microbiota maintains central nervous system homeostasis by regulating immune function and blood-brain barrier integrity. Studies have also found that the gut microbiota influences neurotransmitter, synaptic, and neurotrophic signalling systems, as well as neurogenesis. The principal advantage of the germ-free mouse model lies in proof-of-principle studies, in which a complete microbiota or defined consortia of bacteria can be introduced at various developmental time points. However, a germ-free upbringing can induce permanent neurodevelopmental deficits that may render the model unsuitable for specific scientific queries that do not involve early-life microbial deficiency. As such, alternatives and complementary strategies to the germ-free model are warranted; these include antibiotic treatment to create microbiota-deficient animals at distinct time points across the lifespan. Increasing our understanding of the impact of the gut microbiota on brain and behavior has the potential to inform novel management strategies for stress-related gastrointestinal and neuropsychiatric disorders.
Abstract:
Metal oxide protection layers for photoanodes may enable the development of large-scale solar fuel and solar chemical synthesis, but the poor photovoltages often reported so far severely limit their performance. Here we report a novel observation of photovoltage loss associated with a charge extraction barrier imposed by the protection layer and, by eliminating it, achieve photovoltages as high as 630 mV, the maximum reported so far for water-splitting silicon photoanodes. The loss mechanism is systematically probed in metal-insulator-semiconductor Schottky junction cells compared with buried-junction p⁺n cells, revealing the need to maintain a characteristic hole density at the semiconductor/insulator interface. A leaky-capacitor model related to the dielectric properties of the protective oxide explains this loss, achieving excellent agreement with the data. From these findings, we formulate design principles for the simultaneous optimization of built-in field, interface quality, and hole extraction to maximize the photovoltage of oxide-protected water-splitting anodes.
Abstract:
Shearing is the process in which sheet metal is mechanically cut between two tools. Various shearing technologies are commonly used in the sheet metal industry, for example in cut-to-length lines, slitting lines, and end cropping. Shearing has speed and cost advantages over competing cutting methods such as laser and plasma cutting, but involves large forces on the equipment and large strains in the sheet material. The constant development of sheet metals toward higher strength and formability leads to increased forces on the shearing equipment and tools. Shearing of new sheet materials implies new suitable shearing parameters. Investigations of the shearing parameters through live tests in production are expensive, and separate experiments are time-consuming and require specialized equipment. Studies involving a large number of parameters and coupled effects are therefore preferably performed by finite element based simulations; accurate experimental data to validate such simulations is nevertheless a prerequisite, and is in short supply. In industrial shearing processes, measured forces are always larger than the actual forces acting on the sheet, due to friction losses. Shearing also generates a force that attempts to separate the two tools, which changes the shearing conditions by increasing the clearance between them. Tool clearance is also the most common shearing parameter to adjust, depending on material grade and sheet thickness, to moderate the required force and to control the final sheared-edge geometry. In this work, an experimental procedure that provides a stable tool clearance together with accurate measurements of tool forces and tool displacements was designed, built, and evaluated. Important shearing parameters and demands on the experimental set-up were identified in a sensitivity analysis performed with finite element simulations under the assumption of plane strain.
With respect to tool clearance stability and accurate force measurements, a symmetric experiment with two simultaneous shears and internal balancing of the forces attempting to separate the tools was constructed. Steel sheets of different strength levels were sheared using this experimental set-up, with various tool clearances, sheet clamping configurations, and rake angles. Results showed that tool penetration before fracture decreased with increased material strength. When one side of the sheet was left unclamped and free to move, the required shearing force decreased, but the force attempting to separate the two tools increased. Further, the maximum shearing force decreased and the rollover increased with increased tool clearance. Digital image correlation was applied to measure strains on the sheet surface. The obtained strain fields, together with a material model, were used to compute the stress state in the sheet. A comparison, up to crack initiation, of these experimental results with corresponding results from finite element simulations in three dimensions and under a plane strain approximation showed that effective strains on the surface are representative also of the bulk material. A simple model was successfully applied to calculate the tool forces in shearing with angled tools from forces measured with parallel tools. These results suggest that, with respect to tool forces, a plane strain approximation is valid also for angled tools, at least for small rake angles. In general terms, this study provides a stable symmetric experimental set-up, with internal balancing of lateral forces, for accurate measurements of tool forces, tool displacements, and sheet deformations, with which to study the effects of important shearing parameters. The results give further insight into the strain and stress conditions at crack initiation during shearing, and can also be used to validate models of the shearing process.
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based, but human understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – or the disregard for the fact that instances can be conceptualized independent of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins both the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. 
Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools such as classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schemas, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve, and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence like the one Parsons and Wand propose for information modeling.
Abstract:
To carry out this research on contemporary Portuguese jewellery from 1950 onwards, the subject was first framed historically and geographically. As in the rest of the world, an interdisciplinarity with the other fine arts (especially design) can be observed, with a consequent change in the forms, materials, and techniques previously used. Changes were also observed in the function of the jewel, and the relationship between the jewel and the body was therefore studied. Portugal has a set of jewellery artists in full production, and teaching and professional structures exist, although there is no museum of contemporary jewellery. Even so, the high number of exhibitions and projects related to this theme is noteworthy. Against this background, a contemporary jewellery free of a fixed identity is emerging, one that allows each artist's work to be singular without obeying restrictive rules that would curb creativity.