939 results for Continuously Stirred Bioreactor
Abstract:
The Maynard-Burgess House was excavated by Archaeology in Annapolis from fall 1990 to summer 1992. The still-standing house is located at 163 Duke of Gloucester Street in Annapolis' Historic District and is today being restored by Port of Annapolis, Incorporated. Archaeological testing and excavation of the site were developed alongside architectural analyses and archival research as the initial phase of the home's restoration. The Maynard-Burgess House was continuously occupied by two African-American families, the Maynards and the Burgesses, from the 1850s until the late 1980s. The main block of the house was built between 1850 and 1858 by the household of John T. Maynard, a free African American born in 1810, and his wife Maria Spencer Maynard. Maynard descendants lived in the home until it was foreclosed in 1908 and subsequently sold to the family of Willis and Mary Burgess in 1915. Willis had been a boarder in the home in 1880, and his sister Martha Ready had married John and Maria's son John Henry. Burgess descendants lived at the home until its sale in 1990.
Abstract:
The growth and proliferation of invasive bacteria in engineered systems are an ongoing problem. While there are a variety of physical and chemical processes to remove and inactivate bacterial pathogens, there are many situations in which these tools are no longer effective or appropriate for the treatment of a microbial target. For example, certain strains of bacteria are becoming resistant to commonly used disinfection methods, such as chlorine and UV irradiation. Additionally, the overuse of antibiotics has contributed to the spread of antibiotic resistance, and there is concern that wastewater treatment processes are contributing to the spread of antibiotic-resistant bacteria.
Due to the continually evolving nature of bacteria, it is difficult to develop methods for universal bacterial control in a wide range of engineered systems, as many of our treatment processes are static in nature. Moreover, invasive bacteria are present in many natural and engineered systems where the application of broad-acting disinfectants is impractical because their use may inhibit the desired bioprocesses. Therefore, to better control the growth of treatment-resistant bacteria and to address limitations of current disinfection processes, novel tools that are both specific and adaptable need to be developed and characterized.
In this dissertation, two possible biological disinfection processes were investigated for use in controlling invasive bacteria in engineered systems. First, antisense gene silencing, which is the specific use of oligonucleotides to silence gene expression, was investigated. This work was followed by the investigation of bacteriophages (phages), which are viruses that are specific to bacteria, in engineered systems.
For the antisense gene silencing work, a computational approach was used to quantify the number of off-targets and to determine the effects of off-targets in prokaryotic organisms. For the organisms of
Regarding the work with phages, the disinfection rates of bacteria in the presence of phages were determined. The disinfection rates of
In addition to determining disinfection rates, the long-term bacterial growth inhibition potential was determined for a variety of phages with both Gram-negative and Gram-positive bacteria. It was determined that, on average, phages can be used to inhibit bacterial growth for up to 24 h, and that this effect was concentration-dependent for various phages at specific time points. Additionally, it was found that a phage cocktail was no more effective at inhibiting bacterial growth over the long term than the best-performing phage in isolation.
Finally, for an industrial application, the use of phages to inhibit invasive
In conclusion, this dissertation improved the current methods for designing antisense gene silencing targets for prokaryotic organisms, and characterized phages from an engineering perspective. First, the current design strategy for antisense targets in prokaryotic organisms was improved through the development of an algorithm that minimized the number of off-targets. For the phage work, a framework was developed to predict the disinfection rates in terms of the initial phage and bacterial concentrations. In addition, the long-term bacterial growth inhibition potential of multiple phages was determined for several bacteria. In regard to the phage application, phages were shown to protect both final product yields and yeast concentrations during fermentation. Taken together, this work suggests that the rational design of phage treatment is possible and further work is needed to expand on this foundation.
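The rate framework summarized above lends itself to a compact sketch. Below is a minimal, illustrative Chick-Watson-style inactivation model in which the disinfection rate depends on the initial phage concentration; the functional form, parameter names, and values are assumptions for illustration, not the dissertation's fitted model:

```python
import math

def surviving_bacteria(n0, p0, k, t):
    """Chick-Watson-style inactivation sketch: surviving bacterial
    concentration after time t (h), given an initial phage concentration
    p0 (PFU/mL) and an illustrative rate constant k (mL/(PFU*h))."""
    return n0 * math.exp(-k * p0 * t)

def log_reduction(n0, p0, k, t):
    """Log10 reduction, the usual disinfection metric."""
    return math.log10(n0 / surviving_bacteria(n0, p0, k, t))
```

Under this form, doubling the initial phage concentration doubles the exponent, which is one simple way a disinfection rate can be expressed "in terms of the initial phage and bacterial concentrations."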
Abstract:
We demonstrate that when the future path of the discount rate is uncertain and highly correlated, the distant future should be discounted at significantly lower rates than suggested by the current rate. We then use two centuries of US interest rate data to quantify this effect. Using both random walk and mean-reverting models, we compute the "certainty-equivalent rate" that summarizes the effect of uncertainty and measures the appropriate forward rate of discount in the future. Under the random walk model we find that the certainty-equivalent rate falls continuously from 4% to 2% after 100 years, 1% after 200 years, and 0.5% after 300 years. At horizons of 400 years, the discounted value increases by a factor of over 40,000 relative to conventional discounting. Applied to climate change mitigation, we find that incorporating discount rate uncertainty almost doubles the expected present value of mitigation benefits.
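The mechanism behind the falling certainty-equivalent rate is Jensen's inequality: one averages discount factors, not rates, across uncertain and persistent rate paths, so low-rate paths dominate at long horizons. A minimal Monte Carlo sketch under a random-walk rate model (all parameter values are illustrative, not the paper's estimates):

```python
import math
import random

def certainty_equivalent_rates(r0=0.04, sigma=0.005, horizon=300,
                               n_paths=20000, seed=1):
    """Monte Carlo sketch of the certainty-equivalent discount rate under
    a random-walk model for the rate (floored at zero).  Returns the CE
    rate at each horizon, defined from the expected discount factor
    E[exp(-sum of rates)]."""
    rng = random.Random(seed)
    disc_sum = [0.0] * horizon   # accumulated discount factors per horizon
    for _ in range(n_paths):
        r, cum = r0, 0.0
        for t in range(horizon):
            r = max(r + rng.gauss(0.0, sigma), 0.0)  # random walk, floored
            cum += r
            disc_sum[t] += math.exp(-cum)
    # CE rate at horizon t solves exp(-ce * t) = mean discount factor
    return [-math.log(disc_sum[t] / n_paths) / (t + 1) for t in range(horizon)]
```

Running this shows the CE rate starting near the current rate and declining steadily with horizon, which is the qualitative pattern the paper quantifies with historical data.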
Abstract:
In order to use persistence diagrams as a true statistical tool, it would be very useful to have a good notion of mean and variance for a set of diagrams. In [23], Mileyko and his collaborators made the first study of the properties of the Fréchet mean in (D
Abstract:
The caudal dentate nucleus (DN) in lateral cerebellum is connected with two visual/oculomotor areas of the cerebrum: the frontal eye field and lateral intraparietal cortex. Many neurons in frontal eye field and lateral intraparietal cortex produce "delay activity" between stimulus and response that correlates with processes such as motor planning. Our hypothesis was that caudal DN neurons would have prominent delay activity as well. From lesion studies, we predicted that this activity would be related to self-timing, i.e., the triggering of saccades based on the internal monitoring of time. We recorded from neurons in the caudal DN of monkeys (Macaca mulatta) that made delayed saccades with or without a self-timing requirement. Most (84%) of the caudal DN neurons had delay activity. These neurons conveyed at least three types of information. First, their activity was often correlated, trial by trial, with saccade initiation. Correlations were found more frequently in a task that required self-timing of saccades (53% of neurons) than in a task that did not (27% of neurons). Second, the delay activity was often tuned for saccade direction (in 65% of neurons). This tuning emerged continuously during a trial. Third, the time course of delay activity associated with self-timed saccades differed significantly from that associated with visually guided saccades (in 71% of neurons). A minority of neurons had sensory-related activity. None had presaccadic bursts, in contrast to DN neurons recorded more rostrally. We conclude that caudal DN neurons convey saccade-related delay activity that may contribute to the motor preparation of when and where to move.
Abstract:
Inflammation and the formation of an avascular fibrous capsule have been identified as the key factors controlling the wound-healing-associated failure of implantable glucose sensors. Our aim is to guide advantageous tissue remodeling around implanted sensor leads by the temporal release of dexamethasone (Dex), a potent anti-inflammatory agent, in combination with the presentation of a stable textured surface.
First, Dex-releasing polyurethane porous coatings of controlled pore size and thickness were fabricated using a salt-leaching/gas-foaming technique. Porosity, pore size, thickness, drug release kinetics, drug loading amount, and drug bioactivity were evaluated. In vitro sensor functionality tests were performed to determine whether Dex-releasing porous coatings interfered with sensor performance (increased signal attenuation and/or response times) compared to bare sensors. Drug release from coatings monitored over two weeks presented an initial fast release followed by a slower release. Total release from coatings was highly dependent on the initial drug loading amount. Functional in vitro testing of glucose sensors deployed with porous coatings against glucose standards demonstrated that highly porous coatings minimally affected signal strength and response rate. Bioactivity of the released drug was determined by monitoring Dex-mediated, dose-dependent apoptosis of human peripheral blood-derived monocytes in culture.
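A fast-then-slow release profile of the kind described above is often summarized with a biexponential (burst + sustained) form. The sketch below uses that form with an illustrative burst fraction and rate constants (not fitted values) and shows total release scaling with the initial loading:

```python
import math

def cumulative_release(t, loading, burst_frac=0.4, k_fast=1.0, k_slow=0.05):
    """Biexponential (burst + sustained) release sketch: cumulative drug
    released (same units as `loading`) at time t (days).  The two-phase
    form mirrors a fast initial release followed by a slower one; the
    burst fraction and rate constants are illustrative assumptions."""
    fast = burst_frac * (1.0 - math.exp(-k_fast * t))          # burst phase
    slow = (1.0 - burst_frac) * (1.0 - math.exp(-k_slow * t))  # sustained phase
    return loading * (fast + slow)
```

Because release is proportional to `loading` in this form, total release depends directly on the initial drug loading amount, consistent with the observation above.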
The tissue-modifying effects of Dex-releasing porous coatings were assessed by fully implanting Tygon® tubing in the subcutaneous space of healthy and diabetic rats. Based on encouraging results from these studies, we deployed Dex-releasing porous coatings from the tips of functional sensors in both diabetic and healthy rats. We evaluated whether the tissue-modifying effects translated into accurate, maintainable and reliable sensor signals in the long term. Sensor functionality was assessed by continuously monitoring glucose levels and performing acute glucose challenges at specified time points.
Sensors treated with porous Dex-releasing coatings showed diminished inflammation and enhanced vascularization of the tissue surrounding the implants in healthy rats. Functional sensors with Dex-releasing porous coatings showed enhanced sensor sensitivity over a 21-day period when compared to controls. Enhanced sensor sensitivity was accompanied with an increase in sensor signal lag and MARD score. These results indicated that Dex-loaded porous coatings were able to elicit a favorable tissue response, and that such tissue microenvironment could be conducive towards extending the performance window of glucose sensors in vivo.
The diabetic pilot animal study showed differences in wound healing patterns between healthy and diabetic subjects. Diabetic rats showed lower levels of inflammation and vascularization of the tissue surrounding implants when compared to their healthy counterparts. Also, functional sensors treated with Dex-releasing porous coatings did not show enhanced sensor sensitivity over a 21-day period. Moreover, increases in sensor signal lag and MARD scores were present in porous-coated sensors regardless of Dex loading when compared to bare implants. These results suggest that the altered wound healing patterns presented in diabetic tissues may lead to premature sensor failure when compared to sensors implanted in healthy rats.
Abstract:
BACKGROUND/AIMS: The obesity epidemic has spread to young adults, and obesity is a significant risk factor for cardiovascular disease. The prominence and increasing functionality of mobile phones may provide an opportunity to deliver longitudinal and scalable weight management interventions in young adults. The aim of this article is to describe the design and development of the intervention tested in the Cell Phone Intervention for You study and to highlight the importance of adaptive intervention design that made it possible. The Cell Phone Intervention for You study was a National Heart, Lung, and Blood Institute-sponsored, controlled, 24-month randomized clinical trial comparing two active interventions to a usual-care control group. Participants were 365 overweight or obese (body mass index ≥25 kg/m²) young adults. METHODS: Both active interventions were designed based on social cognitive theory and incorporated techniques for behavioral self-management and motivational enhancement. Initial intervention development occurred during a 1-year formative phase utilizing focus groups and iterative, participatory design. During the intervention testing, adaptive intervention design, where an intervention is updated or extended throughout a trial while assuring the delivery of exactly the same intervention to each cohort, was employed. The adaptive intervention design strategy distributed technical work and allowed introduction of novel components in phases intended to help promote and sustain participant engagement. Adaptive intervention design was made possible by exploiting the mobile phone's remote data capabilities so that adoption of particular application components could be continuously monitored and components subsequently added or updated remotely.
RESULTS: The cell phone intervention was delivered almost entirely via cell phone and was always-present, proactive, and interactive, providing passive and active reminders, frequent opportunities for knowledge dissemination, and multiple tools for self-tracking and receiving tailored feedback. The intervention changed over 2 years to promote and sustain engagement. The personal coaching intervention, alternatively, was delivered primarily by trained coaches following a proven intervention, enhanced with a mobile application, but where all interactions with the technology were participant-initiated. CONCLUSION: The complexity and length of the technology-based randomized clinical trial created challenges in engagement and technology adaptation, which were generally discovered using novel remote monitoring technology and addressed using the adaptive intervention design. Investigators should plan to develop tools and procedures that explicitly support continuous remote monitoring of interventions to support adaptive intervention design in long-term, technology-based studies, as well as developing the interventions themselves.
Abstract:
As we look around a scene, we perceive it as continuous and stable even though each saccadic eye movement changes the visual input to the retinas. How the brain achieves this perceptual stabilization is unknown, but a major hypothesis is that it relies on presaccadic remapping, a process in which neurons shift their visual sensitivity to a new location in the scene just before each saccade. This hypothesis is difficult to test in vivo because complete, selective inactivation of remapping is currently intractable. We tested it in silico with a hierarchical, sheet-based neural network model of the visual and oculomotor system. The model generated saccadic commands to move a video camera abruptly. Visual input from the camera and internal copies of the saccadic movement commands, or corollary discharge, converged at a map-level simulation of the frontal eye field (FEF), a primate brain area known to receive such inputs. FEF output was combined with eye position signals to yield a suitable coordinate frame for guiding arm movements of a robot. Our operational definition of perceptual stability was "useful stability," quantified as continuously accurate pointing to a visual object despite camera saccades. During training, the emergence of useful stability was correlated tightly with the emergence of presaccadic remapping in the FEF. Remapping depended on corollary discharge but its timing was synchronized to the updating of eye position. When coupled to predictive eye position signals, remapping served to stabilize the target representation for continuously accurate pointing. Graded inactivations of pathways in the model replicated, and helped to interpret, previous in vivo experiments. The results support the hypothesis that visual stability requires presaccadic remapping, provide explanations for the function and timing of remapping, and offer testable hypotheses for in vivo studies.
We conclude that remapping allows for seamless coordinate frame transformations and quick actions despite visual afferent lags. With visual remapping in place for behavior, it may be exploited for perceptual continuity.
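At its core, presaccadic remapping is a coordinate update: the corollary discharge shifts the stored retinal target location by the saccade vector, so that combining it with the new eye position leaves the head-centered (pointing) coordinate unchanged. A minimal sketch of that invariance, a deliberate simplification of the model's map-level computation:

```python
def remap(target_retinal, saccade_vector):
    """Predictive remapping sketch: just before a saccade, the corollary
    discharge (a copy of the saccade command) shifts the stored retinal
    target location so it is already correct when the eyes land."""
    return (target_retinal[0] - saccade_vector[0],
            target_retinal[1] - saccade_vector[1])

def retinal_to_head(retinal, eye_position):
    """Combine a retinal location with an eye-position signal to obtain a
    head-centered coordinate suitable for guiding pointing movements."""
    return (retinal[0] + eye_position[0], retinal[1] + eye_position[1])
```

For a target at retinal location (10, 0) with eyes at (0, 0), a saccade of (5, 0) remaps the retinal location to (5, 0); adding the new eye position (5, 0) recovers the same head-centered location (10, 0), which is the "useful stability" the model quantifies.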
Abstract:
The factors that are driving the development and use of grids and grid computing, such as size, dynamic features, distribution and heterogeneity, are also pushing service quality issues to the forefront. These include performance, reliability and security. Although grid middleware can address some of these issues on a wider scale, it has also become imperative to ensure adequate service provision at the local level. Load sharing in clusters can contribute to the provision of a high-quality service by exploiting both static and dynamic information. This paper presents a load-sharing scheme that can satisfy grid computing requirements. It follows a proactive, non-preemptive and distributed approach. Load information is gathered continuously before it is needed, and a task is allocated to the most appropriate node for execution. Performance and reliability are enhanced by the decentralised nature of the scheme and the symmetric roles of the nodes. In addition, the scheme exhibits transparency characteristics that facilitate integration with the grid.
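The allocation step of such a scheme can be sketched simply: with continuously gathered load figures already in hand, a task goes to the currently least-loaded node. This toy version stands in for the scheme's richer static-plus-dynamic information; the names and tie-breaking rule are illustrative:

```python
def allocate(task_cost, nodes, load):
    """Pick the most appropriate node for a task: here, simply the node
    with the lowest continuously gathered load figure (ties broken by
    node id for determinism), then account for the newly placed work."""
    best = min(nodes, key=lambda n: (load[n], n))
    load[best] += task_cost
    return best
```

Because load information is gathered before it is needed, the allocation decision itself is a cheap local lookup rather than a round of on-demand probing.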
Abstract:
We study a two-machine open shop scheduling problem in which the machines are not continuously available for processing. No preemption is allowed in the processing of any operation. The objective is to minimize the makespan. We consider approximability issues of the problem with more than one non-availability interval and present an approximation algorithm with a worst-case ratio of 4/3 for the problem with a single non-availability interval.
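For reference, when both machines are continuously available, the two-machine open shop optimum equals the classic lower bound of Gonzalez and Sahni: the larger of each machine's total load and the longest single job. A small sketch of that bound, against which non-availability intervals make the problem harder:

```python
def openshop2_lower_bound(jobs):
    """Classic lower bound on the two-machine open shop makespan when both
    machines are continuously available: the larger of each machine's
    total load and the longest single job (sum of a job's two operations).
    `jobs` is a list of (a, b) operation lengths.  Gonzalez and Sahni
    showed this bound is achievable in that availability-free setting."""
    load_a = sum(a for a, _ in jobs)
    load_b = sum(b for _, b in jobs)
    longest_job = max(a + b for a, b in jobs)
    return max(load_a, load_b, longest_job)
```

With a non-availability interval, the true optimum can exceed this bound, which is why the paper's 4/3-ratio algorithm is measured against the constrained optimum rather than this quantity.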
Abstract:
This work comprises an accurate computational analysis of levitated liquid droplet oscillations in AC and DC magnetic fields. The AC magnetic field interacting with the induced electric current within the liquid metal droplet generates intense fluid flow and the coupled free surface oscillations. The pseudo-spectral technique is used to solve the turbulent fluid flow equations for the continuously dynamically transformed axisymmetric fluid volume. The volume electromagnetic force distribution is updated with the shape and position change. We start with the ideal fluid test case of undamped Rayleigh-frequency oscillations in the absence of gravity, and then add the viscous and the DC magnetic field damping. The oscillation frequency spectra are further analysed for droplets levitated against gravity in AC and DC magnetic fields in various combinations. In the extreme case, the levitation dynamics of an electrically poorly conducting, diamagnetic droplet (water) are simulated. Applications are aimed at pure electromagnetic material processing techniques and material property measurements in uncontaminated conditions.
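The undamped starting point mentioned above is Rayleigh's classical result for an inviscid free droplet, omega_n^2 = n(n-1)(n+2) sigma / (rho R^3) for surface mode n. A small sketch evaluating it (the material values used in the example are illustrative, not the paper's cases):

```python
import math

def rayleigh_frequency(n, sigma, rho, radius):
    """Rayleigh's inviscid oscillation frequency (Hz) for surface mode n
    of a free droplet: omega_n^2 = n*(n-1)*(n+2)*sigma/(rho*R^3).
    sigma is surface tension (N/m), rho density (kg/m^3), radius in m.
    This is the undamped reference case before viscous and DC magnetic
    field damping are added."""
    omega_sq = n * (n - 1) * (n + 2) * sigma / (rho * radius**3)
    return math.sqrt(omega_sq) / (2 * math.pi)
```

Note that mode n = 1 (pure translation) correctly gives zero frequency, and the fundamental shape oscillation is n = 2, for which the prefactor reduces to 8.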
Abstract:
This work presents a computational analysis of levitated liquid thermal and flow fields with free surface oscillations in AC and DC magnetic fields. The volume electromagnetic force distribution is continuously updated with the shape and position change. The oscillation frequency spectra are analysed for droplets levitated against gravity in AC and DC magnetic fields in various combinations. For larger-volume liquid metal confinement and melting, the semi-levitation induction skull melting process is simulated with the same numerical model. Applications are aimed at pure electromagnetic material processing techniques and material property measurements in uncontaminated conditions.
Abstract:
Anisotropic conductive film (ACF), which consists of an adhesive epoxy matrix and randomly distributed conductive particles, is widely used as the connection material for electronic devices with high I/O counts. However, for the semiconductor industry the reliability of the ACF is still a major concern due to a lack of experimental reliability data. This paper reports the investigations into the moisture-induced failures in Flip-Chip-on-Flex interconnections with Anisotropic Conductive Films (ACFs). Both experimental and modeling methods were applied. In the experiments, the contact resistance was used as a quality indicator and was measured continuously during the accelerated tests (autoclave tests). The temperature, relative humidity and pressure were set at 121°C, 100% RH, and 2 atm, respectively. The contact resistance of the ACF joints increased during the tests, and nearly 25% of the joints were found to be open after 168 hours' testing time. Visible conduction gaps between the adhesive and substrate pads were observed. Cracks at the adhesive/flex interface were also found. For a better understanding of the experimental results, 3-D Finite Element (FE) models were built and a macro-micro modeling method was used to determine the moisture diffusion and moisture-induced stresses inside the ACF joints. Modeling results are consistent with the findings in the experimental work.
Abstract:
This paper reports the investigations into the moisture-induced failures in flip-chip-on-flex interconnections with anisotropic conductive films (ACF). Both experimental and modeling methods were applied. In the experiments, the contact resistance was used as a quality indicator and was measured continuously during the accelerated tests (autoclave tests). The temperature, relative humidity and pressure were set at 121°C, 100% RH, and 1 atm, respectively. The contact resistance of the ACF joints increased during the tests, and nearly 25% of the joints were found to be open after 168 hours' testing time. Visible conduction gaps between the adhesive and substrate pads were observed. Cracks at the adhesive/flex interface were also found. It is believed that the swelling effect of the adhesive and water penetration along the adhesive/flex interface are the main causes of this contact degradation. Another finding from the experimental work was that ACF interconnections that had undergone the reflow treatment were more sensitive to moisture and showed worse reliability during the tests. For a better understanding of the experimental results, 3D finite element (FE) models were built and a macro-micro modeling method was used to determine the moisture diffusion and moisture-induced stresses inside the ACF joints. Modeling results are consistent with the findings in the experimental work.
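The macro step of such moisture modeling is Fickian diffusion. Below is a one-dimensional explicit finite-difference sketch, an intentional simplification of the 3D FE macro-micro approach described above; the grid, diffusivity, and boundary idealization (one saturated face, one sealed face) are illustrative:

```python
def diffuse_1d(conc, d_coeff, dx, dt, steps, boundary=1.0):
    """Explicit finite-difference sketch of Fickian moisture uptake
    through an adhesive layer.  One face is held at the saturated
    (ambient) concentration, normalized to `boundary`; the far face is
    sealed (zero flux).  Returns the concentration profile after `steps`
    time steps."""
    c = list(conc)
    r = d_coeff * dt / dx**2          # explicit scheme stability number
    assert r <= 0.5, "explicit scheme unstable: reduce dt or refine dx"
    for _ in range(steps):
        new = c[:]
        new[0] = boundary              # exposed face: saturated
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i - 1] - 2 * c[i] + c[i + 1])
        new[-1] = new[-2]              # sealed face: zero-flux condition
        c = new
    return c
```

The resulting profile decreases monotonically from the exposed face inward, the qualitative moisture front whose associated swelling stresses the FE models resolve in 3D.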
Abstract:
This paper presents an investigation into dynamic self-adjustment of task deployment and other aspects of self-management through the embedding of multiple policies. Non-dedicated, loosely-coupled computing environments, such as clusters and grids, are increasingly popular platforms for parallel processing. These abundant systems are highly dynamic environments in which many sources of variability affect the run-time efficiency of tasks. The dynamism is exacerbated by the incorporation of mobile devices and wireless communication. This paper proposes an adaptive strategy for the flexible run-time deployment of tasks, to continuously maintain efficiency despite the environmental variability. The strategy centres on policy-based scheduling which is informed by contextual and environmental inputs such as variance in the round-trip communication time between a client and its workers and the effective processing performance of each worker. A self-management framework has been implemented for evaluation purposes. The framework integrates several policy-controlled, adaptive services with the application code, enabling the run-time behaviour to be adapted to contextual and environmental conditions. Using this framework, an exemplar self-managing parallel application is implemented and used to investigate the extent of the benefits of the strategy.
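A policy of the kind described, driven by round-trip-time variance and per-worker processing rate, can be sketched as a rule that scales the work unit sent to a worker and shrinks it when the link looks unstable. The thresholds, names, and scaling rule below are illustrative stand-ins, not the framework's actual policies:

```python
def chunk_size(base, worker_rate, mean_rate, rtt_var, rtt_var_limit=0.25):
    """Policy sketch for run-time task deployment: scale the work unit for
    a worker by its relative processing rate, and halve it when the
    client-worker round-trip-time variance exceeds a threshold (an
    unstable link favours smaller, more recoverable units).  Returns a
    whole number of work items, at least 1."""
    size = base * worker_rate / mean_rate
    if rtt_var > rtt_var_limit:        # unstable link: send smaller units
        size /= 2
    return max(1, int(size))
```

Encoding such rules as swappable policies, rather than hard-coding them, is what lets the run-time behaviour adapt as contextual and environmental conditions change.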