927 results for "Low cost process"


Relevance: 90.00%

Publisher:

Abstract:

Nervous system disorders are associated with cognitive and motor deficits and are responsible for the highest disability rates and global burden of disease. Their recovery paths are vulnerable and depend on the effective combination of plastic brain tissue properties with complex, lengthy and expensive neurorehabilitation programs. This work explores two lines of research envisioning sustainable solutions to improve the treatment of cognitive and motor deficits. Both projects were developed in parallel and shared a pragmatic approach in which low-cost technologies were integrated with common clinical procedures. The aim was to achieve more intensive treatment under specialized monitoring, improve clinical decision-making and increase access to healthcare. The first project (articles I–III) concerned the development and evaluation of a web-based cognitive training platform (COGWEB), suitable for intensive use, either at home or at institutions, and across a wide spectrum of ages and diseases that impair cognitive functioning. It was tested for usability in a memory clinic setting and implemented in a collaborative network comprising 41 centers and 60 professionals. An adherence and intensity study revealed a compliance of 82.8% at six months and an average of six hours/week of continued online cognitive training activities. The second project (articles IV–VI) was designed to create and validate an intelligent rehabilitation device that administers proprioceptive stimuli to the hemiparetic side of stroke patients while performing ambulatory movement characterization (SWORD). Targeted vibratory stimulation was found to be well tolerated, and an automatic motor characterization system retrieved results comparable to the first items of the Wolf Motor Function Test. The global system was tested in a randomized placebo-controlled trial to assess its impact on a common motor rehabilitation task in a relevant clinical environment (early post-stroke). The number of correct movements on a hand-to-mouth task increased by an average of 7.2/minute, while the probability of making an error decreased from 1:3 to 1:9. Neurorehabilitation and neuroplasticity are shifting to more neuroscience-driven approaches. At the same time, their final utility for patients and society largely depends on the development of more effective technologies that facilitate the dissemination of the knowledge produced during the process. The results attained through this work represent a step forward in that direction. Their impact on the quality of rehabilitation services and public health is discussed from clinical, technological and organizational perspectives. This process of reflection and directed speculation has led to further hypotheses, already being explored in new research paths.

Relevance: 90.00%

Publisher:

Abstract:

This work presents the description and investigation of the evaluation of vetiver (Chrysopogon zizanioides) and elephant grass (Pennisetum purpureum) in the design of constructed wetlands for the treatment of domestic wastewater, vegetation being one of the main components of these non-conventional treatment systems. Many "natural systems", such as constructed wetlands, are being considered for wastewater treatment and water pollution control because of their high environmental reliability and their low construction and maintenance costs. The interest in natural systems is based on the conservation of the resources associated with these systems, as opposed to conventional wastewater treatment processes, which are intensive in their use of energy and chemicals. Constructed wetlands are a treatment alternative owing to their high pollutant removal efficiency, their low installation and maintenance cost and their high environmental reliability. A constructed wetland generally consists of a support medium, usually sand or gravel, vegetation, and microorganisms or biofilm, which carry out the different biochemical processes that remove pollutants from the influent. The general objective of this work was to evaluate the removal efficiency of organic matter, solids, total nitrogen and total phosphorus of two plant species, vetiver (Chrysopogon zizanioides) and elephant grass (Pennisetum purpureum), in the design of constructed wetlands for the treatment of domestic wastewater. The constructed wetlands, or pilot systems, are located at the Universidad de Medellín and receive a synthetic water preparation that resembles the characteristics of domestic wastewater. The present work evaluates the percentage removal of the organic load of the wastewater in a constructed-wetland treatment system with two plant species. The system was designed with four modules installed side by side: the first contains no plant species, only the support medium, and constitutes the blank (negative control); the second was planted with vetiver (Chrysopogon zizanioides); the third pilot system with elephant grass (Pennisetum purpureum); and the fourth with Japanese papyrus (Cyperus alternifolius), which constitutes the positive control (+). The experimental modules were cleaned, cut and adapted according to the initial planting set-up and the space required for their arrangement. Each pilot system was filled with a support medium consisting of gravel (5 to 10 cm) and sand (15 to 20 cm), and the substrate was evaluated and characterized by its nominal diameter; the species were then planted in each system over a 3×3 area, and each wetland was conditioned for two weeks with Hoagland and Arnon nutrient solution under a moisture regime. The following parameters were analyzed in the synthetic water: pH, total solids, total suspended solids, total dissolved solids, chemical oxygen demand (COD), biochemical oxygen demand (BOD5), total Kjeldahl nitrogen (TKN) and total phosphorus (TP). Plant growth was also determined from the increase in biomass and root porosity, and TKN and TP were likewise determined in the plants.

The results showed that the system is a low operation and maintenance cost option for the removal of organic load and nutrients from domestic wastewater. In particular, plants that grow under aquic and ustic moisture regimes tend to establish and adapt better in the pilot constructed wetlands; this is the case of elephant grass (Pennisetum purpureum), which showed the highest removal rates of pollutants and nutrients from the influent, followed by Japanese papyrus (Cyperus alternifolius) and vetiver (Chrysopogon zizanioides). The highest removals, in order, were obtained for solids, followed by biochemical oxygen demand (BOD5), chemical oxygen demand (COD), total Kjeldahl nitrogen (TKN) and total phosphorus (TP). The latter show a low removal rate due to the nature of the pollutant, the organisms responsible for removal and uptake, and the chosen retention time, which influences the pollutant removal rate and is lowest for the phosphorus concentration, although the results fall within the range expected for these non-conventional treatment systems.
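The removal efficiencies reported above are conventionally computed from paired influent and effluent concentrations. A minimal sketch of that calculation is given below in Python; the concentration values are illustrative placeholders, not measurements from this study.

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a pollutant from influent (c_in) to effluent (c_out), both in mg/L."""
    return 100.0 * (c_in - c_out) / c_in

# Hypothetical influent/effluent pairs for one wetland module (mg/L);
# values are illustrative only, not data from the study.
samples = {"BOD5": (250.0, 40.0), "COD": (500.0, 120.0), "TP": (8.0, 5.5)}

for parameter, (c_in, c_out) in samples.items():
    print(f"{parameter}: {removal_efficiency(c_in, c_out):.1f}% removal")
```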

Relevance: 90.00%

Publisher:

Abstract:

The assessment of building thermal performance is often carried out using HVAC energy consumption data, when available, or, for free-running buildings, measurements of thermal comfort variables. Both types of data can be obtained by monitoring or by computer simulation. The assessment based on thermal comfort variables is the more complex one because it depends on the determination of the thermal comfort zone. For these reasons, this master's thesis explores methods of building thermal performance assessment using thermal comfort variables simulated with the DesignBuilder software. The main objective is to contribute to the development of methods to support architectural decisions during the design process, as well as energy and sustainability rating systems. The research method consists of selecting thermal comfort methods and modeling them in spreadsheets, with output charts developed to streamline the analyses, which are then used to assess the simulation results of low-cost house configurations. The house models consist of a base case, which is already built, and variations in thermal transmittance, absorptance and shading. The simulation results are assessed using each thermal comfort method in order to identify their sensitivity. The final results show the limitations of the methods, the importance of a method that considers thermal radiation and wind speed, and the contribution of the proposed chart.
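The thesis does not state which comfort models were implemented in the spreadsheets, so as one hedged illustration the sketch below classifies simulated operative temperatures with the ASHRAE 55 adaptive comfort model (comfort temperature 0.31·T_out + 17.8 °C, with a ±3.5 °C band for 80% acceptability); the input temperatures are invented, not DesignBuilder output.

```python
def adaptive_comfort_band(t_out_mean, acceptability=80):
    """ASHRAE 55 adaptive model: comfort temperature and band (deg C) from the
    prevailing mean outdoor temperature. Band is +/-3.5 C (80%) or +/-2.5 C (90%)."""
    t_comf = 0.31 * t_out_mean + 17.8
    half_band = 3.5 if acceptability == 80 else 2.5
    return t_comf - half_band, t_comf + half_band

# Hypothetical hourly operative temperatures (deg C) standing in for simulation output.
operative_temps = [24.1, 26.8, 29.4, 31.0, 27.5]
low, high = adaptive_comfort_band(t_out_mean=25.0)

comfortable = [t for t in operative_temps if low <= t <= high]
print(f"Comfort zone: {low:.1f}-{high:.1f} C; "
      f"{100 * len(comfortable) / len(operative_temps):.0f}% of hours in comfort")
```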

Relevance: 90.00%

Publisher:

Abstract:

In today's big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
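As a single-machine illustration of the neighborhood-centric tasks NSCALE targets (this is not the NSCALE API, which is not detailed here; networkx and the toy graph are stand-ins), the sketch below extracts k-hop ego networks around chosen vertices and computes a simple per-neighborhood statistic.

```python
import networkx as nx

# Toy graph standing in for a large social or biological network.
G = nx.karate_club_graph()

def neighborhood_stats(graph, centers, radius=2):
    """Extract the radius-hop ego network around each center vertex and compute
    a per-neighborhood statistic (here, edge density), mimicking a
    neighborhood-centric analysis task."""
    results = {}
    for v in centers:
        ego = nx.ego_graph(graph, v, radius=radius)
        results[v] = {"nodes": ego.number_of_nodes(),
                      "edges": ego.number_of_edges(),
                      "density": nx.density(ego)}
    return results

for center, stats in neighborhood_stats(G, centers=[0, 33], radius=2).items():
    print(center, stats)
```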

Relevance: 90.00%

Publisher:

Abstract:

The effective action of health workers in the immediate care of cardiorespiratory arrest enables the effective implementation of therapeutic hypothermia, reducing possible brain damage and providing a better prognosis for the patient. This study aimed to understand the process of implementing therapeutic hypothermia after cardiorespiratory arrest in hospitals in the extreme south of Brazil. It was a qualitative, descriptive study. The study setting was two Intensive Care Units of two hospitals where post-cardiac-arrest therapeutic hypothermia is performed. The study subjects were physicians, nurses and nursing technicians working in these units. Data collection comprised two stages: first, a retrospective review of patient records was carried out; subsequently, semi-structured interviews were conducted with the aforementioned professionals, following an interview script and recorded with a digital device. Data collection took place during October 2014. Discursive textual analysis was used to interpret the data, and three categories were constructed. In the first category, "process of implementing therapeutic hypothermia", it was found that the hospital with a systematized and organized implementation uses a written protocol and that, regarding the phases of applying therapeutic hypothermia, both institutions use the traditional methods of induction, maintenance and rewarming. The second category, "facilities and difficulties experienced by the health team during the application of therapeutic hypothermia", identifies the physical structure, team harmony, equipment for constant monitoring of patients' hemodynamic conditions and the optimization of working time as facilitators. Regarding difficulties, the following were found: procurement of materials, such as ice and the BIS; availability of a single esophageal thermometer; lack of personal protective equipment; insufficient knowledge and technical inability; absence of continuing education; and inadequate staffing of nursing professionals. In the third category, "adverse effects and complications encountered by the health team during the application of therapeutic hypothermia and nursing care provided", the adverse effects observed were shivering, bradycardia and hypotension, and the complications included excessive hypothermia and skin burns. Nursing care focused on care of the skin and extremities, use of ice, sedation, hygiene, comfort and preparation of monitoring material. It was concluded that therapeutic hypothermia can be applied safely, effectively and at low cost in the reality of the institutions studied.

Relevance: 90.00%

Publisher:

Abstract:

Tooth loss is a common result of a variety of oral diseases due to physiological causes, trauma, genetic disorders, and aging, and can lead to physical and mental suffering that markedly lowers the individual's quality of life. The tooth is a complex organ composed of mineralized tissues and soft connective tissues. Dentin is the most voluminous tissue of the tooth, and its formation (dentinogenesis) is a highly regulated process displaying several similarities with osteogenesis. In this study, gelatin, i.e. thermally denatured collagen, was used as a promising low-cost material to develop scaffolds for hard tissue engineering. We synthesized dentin-like scaffolds using gelatin biomineralized with magnesium-doped hydroxyapatite and blended with alginate. With a controlled freeze-drying process and alginate cross-linking, it is possible to obtain scaffolds with microscopic aligned channels suitable for tissue engineering. 3D cell culture with mesenchymal stem cells showed the promising properties of the new scaffolds for tooth regeneration. In detail, the chemical-physical features of the scaffolds, mimicking those of the natural tissue, facilitate cell adhesion, and the porosity is suitable for long-term cell colonization and fine cell-material interactions.

Relevance: 90.00%

Publisher:

Abstract:

The current energy market requires urgent revision to introduce renewable, less-polluting and inexpensive energy sources. Biohydrogen (bioH2) is considered one of the most appropriate options for this model shift, being easily produced through the anaerobic fermentation of carbohydrate-containing biomass. Ideally, the feedstock should be low-cost, widely available and convertible into a product of interest. Microalgae are considered to possess these properties and are also highly valued for their capability to assimilate CO2 [1]. The microalga Spirogyra sp. is able to accumulate high concentrations of intracellular starch, a preferential carbon source for some bioH2-producing bacteria such as Clostridium butyricum [2]. In the present work, Spirogyra biomass was submitted to acid hydrolysis to degrade polymeric components and increase the biomass fermentability. Initial tests of bioH2 production in 120 mL reactors with C. butyricum yielded a maximum volumetric productivity of 141 mL H2/L·h and an H2 production yield of 3.78 mol H2/mol of consumed sugars. Subsequently, a sequential batch reactor (SBR) was used for continuous H2 production from Spirogyra hydrolysate. After 3 consecutive batches, the fermentation achieved a maximum volumetric productivity of 324 mL H2/L·h, higher than most results obtained in similar production systems [3], and a potential H2 production yield of 10.4 L H2/L hydrolysate per day. The H2 yield achieved in the SBR was 2.59 mol H2/mol, a value comparable to those attained with several thermophilic microorganisms [3], [4]. In the present work, a detailed energy consumption analysis of the microalgae value chain is presented and compared with previous results from the literature. The specific energy requirements were determined, with g H2 and MJ H2 as the functional units. It was possible to identify the process stages responsible for the highest energy consumption during bioH2 production from Spirogyra biomass, for further optimisation.
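For context, dark fermentation of hexose sugars has a stoichiometric ceiling of 4 mol H2 per mol of hexose via the acetate pathway, so the yields quoted above can be restated as a fraction of that maximum; the sketch below does this and also converts the molar yields to gas volumes assuming an ideal-gas molar volume of 22.4 L/mol at STP (an assumption, not the reported operating conditions).

```python
THEORETICAL_MAX = 4.0    # mol H2 per mol hexose, acetate pathway (dark fermentation)
MOLAR_VOLUME_STP = 22.4  # L/mol, ideal gas at 0 C and 1 atm (assumed, not measured)

def yield_summary(mol_h2_per_mol_sugar):
    """Express a fermentative H2 yield as a fraction of the stoichiometric maximum
    and as a gas volume per mol of sugar consumed (STP assumption)."""
    fraction = mol_h2_per_mol_sugar / THEORETICAL_MAX
    litres = mol_h2_per_mol_sugar * MOLAR_VOLUME_STP
    return fraction, litres

# Yields taken from the abstract above.
for label, y in [("batch (120 mL reactors)", 3.78), ("sequential batch reactor", 2.59)]:
    frac, litres = yield_summary(y)
    print(f"{label}: {y} mol H2/mol sugar = {frac:.0%} of theoretical max, "
          f"{litres:.1f} L H2 per mol sugar at STP")
```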

Relevance: 90.00%

Publisher:

Abstract:

A Bayesian optimisation algorithm for a nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse's assignment. When a human scheduler works, he normally builds a schedule systematically following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not yet completed, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. In this paper, we design a more human-like scheduling algorithm by using a Bayesian optimisation algorithm to implement explicit learning from past solutions. A nurse scheduling problem from a UK hospital is used for testing. Unlike our previous work that used Genetic Algorithms to implement implicit learning [1], the learning in the proposed algorithm is explicit, i.e. we identify and mix building blocks directly. The Bayesian optimisation algorithm is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, new rule strings have been obtained. Sets of rule strings are generated in this way, some of which will replace previous strings based on fitness. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. For clarity, consider the following toy example of scheduling five nurses with two rules (1: random allocation, 2: allocate nurse to low-cost shifts). At the beginning of the search, the probability of choosing rule 1 or 2 for each nurse is equal, i.e. 50%. After a few iterations, due to the selection pressure and reinforcement learning, we observe two solution pathways: because pure low-cost or pure random allocation produces low-quality solutions, either rule 1 is used for the first 2-3 nurses and rule 2 for the remainder, or vice versa. In essence, the Bayesian network learns 'use rule 2 after 2-3 uses of rule 1', or vice versa. It should be noted that for our and most other scheduling problems, the structure of the network model is known and all variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data. Thus, learning can amount to 'counting' in the case of multinomial distributions. For our problem, we use four rules: Random, Cheapest Cost, Best Cover and Balance of Cost and Cover. In more detail, the steps of our Bayesian optimisation algorithm for nurse scheduling are: 1. Set t = 0, and generate an initial population P(0) at random; 2. Use roulette-wheel selection to choose a set of promising rule strings S(t) from P(t); 3. Compute conditional probabilities of each node according to this set of promising solutions; 4. Assign each nurse using roulette-wheel selection based on the rules' conditional probabilities. A set of new rule strings O(t) will be generated in this way; 5. Create a new population P(t+1) by replacing some rule strings from P(t) with O(t), and set t = t+1; 6. If the termination conditions are not met (we use 2000 generations), go to step 2.
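As a rough illustration of the explicit-learning loop in the steps above, the sketch below estimates a multinomial rule distribution per nurse by counting rule usage in the promising strings and samples new rule strings from it. It is a simplification, not the authors' implementation: the paper builds a Bayesian network over the joint distribution, whereas this sketch keeps each nurse's probabilities independent, and the fitness function and population sizes are placeholders.

```python
import random

RULES = ["Random", "Cheapest Cost", "Best Cover", "Balance of Cost and Cover"]
N_NURSES, POP_SIZE, N_PROMISING, GENERATIONS = 5, 30, 10, 50

def fitness(rule_string):
    # Placeholder objective: the real algorithm scores the schedule obtained by
    # applying each nurse's rule (cost, cover, feasibility). This dummy merely
    # gives the loop something to optimise.
    return -sum(i * r for i, r in enumerate(rule_string))

def sample(probs):
    # Roulette-wheel selection of one rule index per nurse from its probability vector.
    return [random.choices(range(len(RULES)), weights=p)[0] for p in probs]

# Step 1: uniform rule probabilities for every nurse and a random initial population.
probs = [[1.0 / len(RULES)] * len(RULES) for _ in range(N_NURSES)]
population = [sample(probs) for _ in range(POP_SIZE)]

for t in range(GENERATIONS):
    # Steps 2-3: keep promising rule strings and re-estimate each nurse's rule
    # probabilities by counting (with add-one smoothing).
    promising = sorted(population, key=fitness, reverse=True)[:N_PROMISING]
    probs = [[(sum(1 for s in promising if s[n] == r) + 1) / (N_PROMISING + len(RULES))
              for r in range(len(RULES))]
             for n in range(N_NURSES)]
    # Steps 4-5: sample new rule strings and replace the worse half of the population.
    offspring = [sample(probs) for _ in range(POP_SIZE // 2)]
    population = sorted(population, key=fitness, reverse=True)[:POP_SIZE - len(offspring)] + offspring

best = max(population, key=fitness)
print("Best rule string:", [RULES[r] for r in best])
```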
Computational results from 52 real data instances demonstrate the success of this approach. They also suggest that the learning mechanism in the proposed approach might be suitable for other scheduling problems. Another direction for further research is to see whether there is a good constructing sequence for individual data instances, given a fixed nurse scheduling order. If so, the good patterns could be recognized and then extracted as new domain knowledge. Thus, by using this extracted knowledge, we can assign specific rules to the corresponding nurses beforehand, and only schedule the remaining nurses with all available rules, making it possible to reduce the solution space. Acknowledgements: The work was funded by the UK Government's major funding agency, the Engineering and Physical Sciences Research Council (EPSRC), under grant GR/R92899/01. References: [1] Aickelin U, "An Indirect Genetic Algorithm for Set Covering Problems", Journal of the Operational Research Society, 53(10): 1118-1126,

Relevance: 90.00%

Publisher:

Abstract:

This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: An approximation algorithm for an NP-Hard problem is a polynomial time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible. The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below. We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes. We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications. Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink. 
We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results. In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
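In the notation used above, the two recurring quantities can be written compactly. For a minimization problem with optimal value OPT(I) on instance I, an algorithm ALG is an α-approximation when

\[
\mathrm{ALG}(I) \;\le\; \alpha \cdot \mathrm{OPT}(I) \quad \text{for every instance } I,
\]

and the density used by the pruning procedure, for a subgraph G with edge costs c(e), is

\[
\mathrm{density}(G) \;=\; \frac{\sum_{e \in E(G)} c(e)}{|V(G)|}.
\]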

Relevance: 90.00%

Publisher:

Abstract:

This thesis is devoted to the development, synthesis, properties, and applications of nanomaterials for critical technologies, covering three areas: (1) Microbial contamination of drinking water is a serious problem of global significance. About 51% of the waterborne disease outbreaks in the United States can be attributed to contaminated ground water. The development of metal oxide nanoparticles as viricidal materials is of technological and fundamental scientific importance. Nanoparticles with high surface areas and ultra-small particle sizes have dramatically enhanced efficiency and capacity for virus inactivation, which cannot be achieved by their bulk counterparts. A series of metal oxide nanoparticles, such as iron oxide nanoparticles, zinc oxide nanoparticles and iron oxide-silver nanoparticles, coated on fiber substrates was developed in this research for evaluation of their viricidal activity. We also carried out XRD, TEM, SEM, XPS, surface area and zeta potential measurements on these nanoparticles. MS2 virus inactivation experiments showed that these metal oxide nanoparticle coated fibers were extremely powerful viricidal materials. Results from this research suggest that zinc oxide nanoparticles with a diameter of 3.5 nm, showing an isoelectric point (IEP) at 9.0, were well dispersed on fiberglass. These fibers offer an increase in capacity by orders of magnitude over all other materials. Compared to iron oxide nanoparticles, zinc oxide nanoparticles did not show an improvement in inactivation kinetics, but inactivation capacities did increase by two orders of magnitude, to 99.99%. Furthermore, zinc oxide nanoparticles have a higher affinity for viruses than the iron oxide nanoparticles in the presence of competing ions. The advantages of zinc oxide derive from its high surface charge density, small nanoparticle size and capability of generating reactive oxygen species. The research at its present stage of development appears to offer the best avenue to remove viruses from water. Without additional chemicals and energy input, this system can be implemented both as a point-of-use (POU) and as a large-scale water treatment technology, which will have a significant impact on the water purification industry. (2) A new family of aliphatic polyester lubricants has been developed for use in micro-electromechanical systems (MEMS), specifically for hard disk drives that operate at high spindle speeds (>15,000 rpm). Our program was initiated to address current problems with spin-off of the perfluoropolyether (PFPE) lubricants. The new polyester lubricant appears to alleviate spin-off problems and at the same time improves the chemical and thermal stability. This new system provides a low-cost alternative to PFPE along with improved adhesion to the substrates. In addition, it displays a much lower viscosity, which may be of importance to stiction-related problems. The synthetic route is readily scalable in case additional interest emerges in other areas, including small motors. (3) The demand for increased signal transmission speed and device density for the next generation of multilevel integrated circuits has placed stringent demands on materials performance. Currently, integration of ultra-low-k materials in dual Damascene processing requires chemical mechanical polishing (CMP) to planarize the copper. Unfortunately, none of the commercially proposed dielectric candidates display the desired mechanical and thermal properties for successful CMP.
A new polydiacetylene thermosetting polymer (DEB-TEB), which displays a low dielectric constant (low-k) of 2.7, was recently developed. This novel material appears to offer the only avenue for designing an ultra-low-k dielectric (k = 1.85) that can still display the desired modulus (7.7 GPa) and hardness (2.0 GPa) sufficient to withstand the CMP process. We focused on further characterization of the thermal properties of spin-on poly(DEB-TEB) ultra-thin films, including the coefficient of thermal expansion (CTE), biaxial thermal stress and thermal conductivity. The CTE is 2.0 × 10⁻⁵ K⁻¹ in the perpendicular direction and 8.0 × 10⁻⁶ K⁻¹ in the planar direction. The low CTE provides a better match to the Si substrate, which minimizes interfacial stress and greatly enhances the reliability of the microprocessors. Initial experiments with oxygen plasma etching suggest a high probability of success for achieving vertical profiles.

Relevance: 90.00%

Publisher:

Abstract:

This study presents two novel methods for treating important environmental contaminants from two different wastewater streams. One process utilizes the kinetic advantages and reliability of ion-exchanging clinoptilolite in combination with biological treatment to remove ammonium from municipal sewage. A second process, HAMBgR (Hybrid Adsorption Membrane Biological Reactor), combines both ion exchange resin and bacteria in a single reactor to treat perchlorate-contaminated waters. Combining physicochemical adsorptive treatment with biological treatment can provide synergistic benefits to the overall removal processes. Ion exchange removal solves some of the common operational reliability limitations of biological treatment, such as slow response to environmental changes and leaching. Biological activity can in turn help reduce the economic and environmental challenges of ion exchange processes, such as regenerant cost and brine disposal. The second section of this study presents continuous flow column experiments, used to demonstrate the ability of clinoptilolite to remove wastewater ammonium, as well as the effectiveness of salt regeneration using highly concentrated sea salt solutions. The working capacity of clinoptilolite more than doubled over the first few loading cycles, while regeneration recovered more than 98% of ammonium. Using the regenerant brine for subsequent halotolerant algae growth allowed for its repeated use, which could lead to cost savings and production of valuable algal biomass. The algae were able to take up all ammonium in solution, and the brine could be used again with no loss in regeneration efficiency. This process has significant advantages over conventional biological nitrification: shorter retention times, a wider range of operational conditions, and higher-quality effluent free of nitrate. Also, since the clinoptilolite is continually regenerated and the regenerant is rejuvenated by algae, overall input costs are expected to be low. The third section of this study introduces the HAMBgR process for the elimination of perchlorate and presents batch isotherm experiments and pilot reactor tests. Results showed that a variety of ion-exchange resins can be effectively and repeatedly regenerated biologically and maintain an acceptable working capacity. The presence of an adsorbent in the HAMBgR process improved bioreactor performance during operational fluctuations by providing a physicochemical backup to the biological process. Pilot reactor tests showed that the HAMBgR process reduced effluent perchlorate spikes by up to 97% in comparison to a conventional membrane bioreactor (MBR) that was subject to sudden changes in influent conditions. Also, the HAMBgR process stimulated biological activity and led to higher biomass concentrations during increased contaminant loading conditions. Conventional MBR systems can be converted into HAMBgRs at a low cost, easily justifiable by the realized benefits. The concepts employed in the HAMBgR process can be adapted to treat other target contaminants, not just perchlorate.

Relevance: 90.00%

Publisher:

Abstract:

In order to power our planet for the next century, clean energy technologies need to be developed and deployed. Photovoltaic solar cells, which convert sunlight into electricity, are a clear option; however, they currently supply 0.1% of US electricity due to the relatively high cost per Watt of generation. Thus, our goal is to create more power from a photovoltaic device while simultaneously reducing its price. To accomplish this goal, we are creating new high-efficiency anti-reflection coatings that allow more of the incident sunlight to be converted to electricity, using simple and inexpensive coating techniques that enable reduced manufacturing costs. Traditional anti-reflection coatings (consisting of thin layers of non-absorbing materials) rely on the destructive interference of the reflected light, causing more light to enter the device and subsequently get absorbed. While these coatings are used on nearly all commercial cells, they are wavelength dependent and are deposited using expensive processes that require elevated temperatures, which increase production cost and can be detrimental to some temperature-sensitive solar cell materials. We are developing two new classes of anti-reflection coatings (ARCs) based on textured dielectric materials: (i) a transparent, flexible paper technology that relies on optical scattering and reduced refractive index contrast between the air and the semiconductor, and (ii) silicon dioxide (SiO2) nanosphere arrays that rely on collective optical resonances. Both techniques improve solar cell absorption and ultimately yield high-efficiency, low-cost devices. For the transparent paper-based ARCs, we have recently shown that they improve solar cell efficiencies for all angles of incident illumination, reducing the need for costly tracking of the sun's position. For a GaAs solar cell, we achieved a 24% improvement in the power conversion efficiency using this simple coating. Because the transparent paper is made from an earth-abundant material (wood pulp) using an easy, inexpensive and scalable process, this type of ARC is an excellent candidate for future solar technologies. The coatings based on arrays of dielectric nanospheres also show excellent potential for inexpensive, high-efficiency solar cells. The fabrication process is based on a Meyer rod rolling technique, which can be performed at room temperature and applied to mass production, yielding a scalable and inexpensive manufacturing process. The deposited monolayer of SiO2 nanospheres, with a diameter of 500 nm, on a bare Si wafer leads to a significant increase in light absorption and a higher expected current density based on initial simulations, on the order of 15-20%. With application on a Si solar cell containing a traditional anti-reflection coating (Si3N4 thin film), an additional increase in the spectral current density is observed, 5% beyond what a typical commercial device would achieve. Due to the coupling between the spheres, originating from whispering gallery modes (WGMs) inside each nanosphere, the incident light is strongly coupled into the high-index absorbing material, leading to increased light absorption. Furthermore, the SiO2 nanospheres scatter and diffract light in such a way that both the optical and electrical properties of the device have little dependence on incident angle, eliminating the need for solar tracking.
Because the layer can be made with an easy, inexpensive, and scalable process, this anti-reflection coating is also an excellent candidate for replacing conventional technologies relying on complicated and expensive processes.
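For contrast with the textured coatings above, the traditional interference coating mentioned at the start of this abstract follows two textbook single-layer conditions (standard thin-film optics, not results of this work): the film index is chosen near the geometric mean of the indices it separates, and its thickness is a quarter of the design wavelength in the film,

\[
n_{\mathrm{coat}} \approx \sqrt{n_{\mathrm{air}}\, n_{\mathrm{sub}}}, \qquad d = \frac{\lambda_0}{4\, n_{\mathrm{coat}}},
\]

so that the reflections from the two interfaces are comparable in amplitude and half a wavelength out of phase, cancelling at the design wavelength \(\lambda_0\) but not across the whole spectrum, which is why such coatings are wavelength dependent.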

Relevance: 90.00%

Publisher:

Abstract:

In an increasingly complex and evolving society, we are put to the test by many challenges. Despite its dependencies and conveniences, the population tends to seek new forms of leisure that involve nature and the practice of sport. Cycle tourism meets these practices and has become an evolving market whose number of adherents has been growing; it is a type of tourism that brings positive results, such as a healthy lifestyle and the use of a low-cost means of transport with a small ecological footprint. Along the same lines, this document addresses the entire process of developing a quadricycle. Its general objective is to satisfy the needs of cycle-tourism practitioners and to encourage people to use the bicycle as a means of transport. To this end, the participants involved were regular bicycle users who ride not only for leisure but for cycle touring. The users of the designed vehicle thus have the privilege of having the heart of nature as the backdrop to their journeys, enjoying the comfort and safety the vehicle provides and, as a result, a healthier lifestyle. In a sequential and organized manner, based on the Ulrich and Eppinger methodology, the steps taken to obtain the final product are laid out. Beyond the strong concepts of comfort, stability, engineering and ergonomics, it is essential to highlight the importance of design, since it influenced the layout of the four-wheeled vehicle from beginning to end.

Relevance: 90.00%

Publisher:

Abstract:

One of the biggest environmental problems facing the population is the lack of sewage treatment, especially in rural and low-income communities. Efficient, low-cost sanitation technologies need to be developed to bring this basic service to disadvantaged people. This work proposed the implementation of a technology called constructed wetlands, also known as a Root Zone Wastewater Treatment Plant (ETEZR). The objective was to develop a non-formal environmental education proposal for sanitation, using outreach methods for residents, and to deploy the ETEZR technology in the rural community of Cologne Grebe in São José dos Pinhais - PR. With technical support from the Paraná Technical Assistance and Rural Extension Institute (EMATER) and the Federal University of Technology - Paraná (UTFPR), 5 ETEZR were deployed in the community through three theoretical and practical workshops, which involved in total 67 people from the community, 5 EMATER technicians and 13 staff from the Municipal Town Hall. Four months after implementation, two samplings of raw and treated wastewater were carried out to analyze physical, chemical and biological parameters. The results for the chemical parameters BOD, COD, phosphorus and ammonia nitrogen, comparing raw and treated sewage, demonstrate that the ETEZR are effective in treating sewage. Across the 5 treatment plants, the minimum and maximum removal efficiencies for the parameters analyzed were 52.2 to 95.5% for BOD; 47 to 94.5% for COD; 21.5 to 96% for phosphorus; and 30 to 98% for ammonia nitrogen. Oils and greases and the solids series also showed a significant reduction when comparing raw and treated sewage, and the biological parameters, evaluated by means of coliforms, showed a reduction of 80 to 99%. With the implementation of the environmental education process aimed at sanitation, it was possible to evaluate the population's acceptance of the sanitation technology using the ETEZR and to understand the sanitation needs and concepts of the community. The research evaluated the development of the non-formal environmental education methodology applied, in order to provide input for a rural sanitation planning process for the municipality.

Relevance: 90.00%

Publisher:

Abstract:

The design demands on water and sanitation engineers are rapidly changing. The global population is set to rise from 7 billion to 10 billion by 2083. Urbanisation in developing regions is increasing at such a rate that a predicted 56% of the global population will live in an urban setting by 2025. Compounding these problems, the global water and energy crises are impacting the Global North and South alike. High-rate anaerobic digestion offers a low-cost, low-energy treatment alternative to the energy-intensive aerobic technologies used today. Widespread implementation, however, is hindered by the lack of capacity to engineer high-rate anaerobic digestion for the treatment of complex wastes such as sewage. This thesis utilises the Expanded Granular Sludge Bed (EGSB) bioreactor as a model system in which to study the ecology, physiology and performance of high-rate anaerobic digestion of complex wastes. The impacts of a range of engineered parameters, including reactor geometry, wastewater type, operating temperature and organic loading rate, are systematically investigated using lab-scale EGSB bioreactors. Next-generation sequencing of 16S amplicons is utilised as a means of monitoring microbial ecology. Microbial community physiology is monitored by means of specific methanogenic activity testing, and a range of physical and chemical methods are applied to assess reactor performance. Finally, the limit state approach is trialled as a method for testing the EGSB and is proposed as a standard method for biotechnology testing, enabling improved process control at full scale. The arising data are assessed both qualitatively and quantitatively. Lab-scale reactor design is demonstrated to significantly influence the spatial distribution of the underlying ecology and community physiology in lab-scale reactors, a vital finding for both researchers and full-scale plant operators responsible for monitoring EGSB reactors. Recurrent trends in the data indicate that hydrogenotrophic methanogenesis dominates in high-rate anaerobic digestion at both full and lab scale when subject to engineered or operational stresses, including low temperature and variable feeding regimes. This is of relevance for those seeking to define new directions in the fundamental understanding of syntrophic and competitive relations in methanogenic communities, and also to design engineers in determining operating parameters for full-scale digesters. The adoption of the limit state approach enabled the identification of biological indicators providing early warning of failure under high-solids loading, a vital insight for those currently working empirically towards the development of new biotechnologies at lab scale.