859 results for Point-of-care systems


Relevance: 100.00%

Abstract:

Light microscopy has long been one of the most common tools in biological research because of its high resolution and the non-invasive nature of light. Due to its high sensitivity and specificity, fluorescence is one of the most important readout modes of light microscopy. This thesis presents two new fluorescence microscopic imaging techniques: fluorescence optofluidic microscopy and fluorescence Talbot microscopy. The designs of the two systems are fundamentally different from conventional microscopy, making compact and portable devices possible. The components of the devices are suitable for mass production, making these microscopic imaging systems more affordable for biological research and clinical diagnostics.

Fluorescence optofluidic microscopy (FOFM) is capable of imaging fluorescent samples in fluid media. The FOFM employs an array of Fresnel zone plates (FZP) to generate an array of focused light spots within a microfluidic channel. As a sample flows through the channel and across the array of focused light spots, a filter-coated CMOS sensor collects the fluorescence emissions. The collected data can then be processed to render a fluorescence microscopic image. The resolution, which is determined by the focused light spot size, is experimentally measured to be 0.65 μm.

Fluorescence Talbot microscopy (FTM) is a chip-scale fluorescence microscopy technique that enables large field-of-view (FOV), high-resolution imaging. The FTM method utilizes the Talbot effect to project a grid of focused excitation light spots onto the sample, which is placed on a filter-coated CMOS sensor chip. The fluorescence emissions associated with each focal spot are collected by the sensor chip and composed into a sparsely sampled fluorescence image. By raster-scanning the Talbot focal spot grid across the sample and collecting a sequence of sparse images, a filled-in high-resolution fluorescence image can be reconstructed. In contrast to a conventional microscope, the collection efficiency, resolution, and FOV are not tied to one another in this technique, and the FOV of FTM is directly scalable. Our FTM prototype has demonstrated a resolution of 1.2 μm and a collection efficiency equivalent to that of a conventional microscope objective with a 0.70 N.A. The FOV is 3.9 mm × 3.5 mm, which is 100 times larger than that of a 20X/0.40 N.A. conventional microscope objective. Due to its large FOV, high collection efficiency, compactness, and potential for integration with other on-chip devices, FTM is suitable for diverse applications such as point-of-care diagnostics, large-scale functional screens, and long-term automated imaging.
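The raster-scan reconstruction described above amounts to interleaving sparse frames on a shifted grid. Below is a minimal illustrative sketch (not code from the thesis; the frame layout, pitch, and function names are assumptions) of how a sequence of sparse images could be composed into a filled-in image:

```python
import numpy as np

def reconstruct_ftm(sparse_frames, pitch):
    """Interleave raster-scanned sparse frames into a filled-in image.

    sparse_frames: dict mapping scan offset (dy, dx) -> 2D array of
    per-spot fluorescence values (one value per focal spot).
    pitch: spot-grid pitch in reconstructed-image pixels; the scan is
    assumed to visit every offset in a pitch x pitch raster.
    """
    n_rows, n_cols = next(iter(sparse_frames.values())).shape
    full = np.zeros((n_rows * pitch, n_cols * pitch))
    for (dy, dx), frame in sparse_frames.items():
        # Each frame samples the object on a coarse grid shifted by (dy, dx).
        full[dy::pitch, dx::pitch] = frame
    return full

# Toy usage: a 4x4 focal-spot grid scanned over a 3x3 raster (pitch = 3).
rng = np.random.default_rng(0)
frames = {(dy, dx): rng.random((4, 4)) for dy in range(3) for dx in range(3)}
image = reconstruct_ftm(frames, pitch=3)
```

The reconstructed image here is 12 × 12 pixels: each of the 9 scan positions fills in one interleaved sub-grid of the final image.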

Abstract:

Despite the complexity of biological networks, we find that certain common architectures govern network structures. These architectures impose fundamental constraints on system performance and create tradeoffs that the system must balance in the face of environmental uncertainty. This means that while a system may be optimized for a specific function through evolution, the optimal achievable state must still obey these constraints. One such constraining architecture is autocatalysis, seen in many biological networks including glycolysis and ribosomal protein synthesis. Using a minimal model, we show that ATP autocatalysis in glycolysis imposes stability and performance constraints, and that the experimentally well-studied glycolytic oscillations are in fact a consequence of a tradeoff between error minimization and stability. We also show that additional complexity in the network results in increased robustness. Ribosome synthesis is also autocatalytic: ribosomes must be used to make more ribosomal proteins, and when ribosomes have higher protein content, the autocatalysis is stronger. We show that this autocatalysis destabilizes the system, slows its response, and constrains its performance. On a larger scale, transcriptional regulation of whole organisms also follows architectural constraints, as can be seen in the differences between bacterial and yeast transcription networks. We show that the degree distribution of the bacterial transcription network follows a power law, while that of the yeast network follows an exponential distribution. We then examine previously proposed evolutionary models and show that neither the preferential-linking model nor the duplication-divergence model of network evolution generates the power-law, hierarchical structure found in bacteria. However, in real biological systems new nodes arise through both duplication and horizontal gene transfer, and we show that a biologically reasonable combination of the two mechanisms generates the observed network structure.
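The combined growth mechanism can be illustrated with a toy model (a sketch under assumed parameters, not the model from the thesis): each new node either duplicates the outgoing links of a randomly chosen existing node, or, as a stand-in for horizontal gene transfer, links to randomly chosen existing nodes.

```python
import random
from collections import Counter

def grow_network(n_nodes, p_dup, k_hgt, seed=0):
    """Toy growth model mixing node duplication with random attachment
    (a stand-in for horizontal gene transfer). Returns out-adjacency sets.
    """
    rng = random.Random(seed)
    adj = {0: set(), 1: {0}}  # small seed graph
    for new in range(2, n_nodes):
        if rng.random() < p_dup:
            # Duplication: copy the outgoing links of a random existing node.
            template = rng.randrange(new)
            adj[new] = set(adj[template])
        else:
            # HGT-like step: link to k randomly chosen existing nodes.
            k = min(k_hgt, new)
            adj[new] = set(rng.sample(range(new), k))
    return adj

adj = grow_network(2000, p_dup=0.5, k_hgt=2)
in_deg = Counter(t for links in adj.values() for t in links)
# Tally how many nodes have each in-degree, i.e. the degree distribution
# on the regulated-gene side of the network.
hist = Counter(in_deg.values())
```

Sweeping `p_dup` between 0 and 1 interpolates between a pure duplication model and a pure random-attachment model, which is the kind of mixture the abstract argues reproduces the bacterial network structure.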

Abstract:

In the past, many methodologies have been devised to support software development, and separate sets of methodologies have been developed to support the analysis of software artefacts. We have identified this mismatch as one of the causes of the poor reliability of embedded systems software. The issue with software development styles is that they are "analysis-agnostic": they do not try to structure the code in a way that lends itself to analysis, so analysis is usually applied post-mortem, after the software has been developed, and requires a large amount of effort. The issue with software analysis methodologies is that they do not exploit available information about the system being analyzed.

In this thesis we address the above issues by developing a new methodology, called "analysis-aware" design, that links software development styles with the capabilities of analysis tools. This methodology forms the basis of a framework for interactive software development. The framework consists of an executable specification language and a set of analysis tools based on static analysis, testing, and model checking. The language enforces an analysis-friendly code structure and offers primitives that allow users to implement their own testers and model checkers directly in the language. We introduce a new approach to static analysis that takes advantage of the capabilities of a rule-based engine. We have applied the analysis-aware methodology to the development of a smart home application.

Abstract:

This thesis is motivated by safety-critical applications involving autonomous air, ground, and space vehicles carrying out complex tasks in uncertain and adversarial environments. We use temporal logic as a language to formally specify complex tasks and system properties. Temporal logic specifications generalize the classical notions of stability and reachability that are studied in the control and hybrid systems communities. Given a system model and a formal task specification, the goal is to automatically synthesize a control policy for the system that ensures that the system satisfies the specification. This thesis presents novel control policy synthesis algorithms for optimal and robust control of dynamical systems with temporal logic specifications. Furthermore, it introduces algorithms that are efficient and extend to high-dimensional dynamical systems.

The first contribution of this thesis is the generalization of a classical linear temporal logic (LTL) control synthesis approach to optimal and robust control. We show how we can extend automata-based synthesis techniques for discrete abstractions of dynamical systems to create optimal and robust controllers that are guaranteed to satisfy an LTL specification. Such optimal and robust controllers can be computed at little extra computational cost compared to computing a feasible controller.
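The automaton-based synthesis approach referred to above can be illustrated for the simplest co-safe case: form the product of a finite transition system with an automaton that tracks the specification, then search the product for a run reaching an accepting automaton state. The following is a minimal sketch (the data layout, function names, and the "stay put on undefined transitions" convention are assumptions for illustration, not the thesis implementation):

```python
from collections import deque

def product_search(ts, labels, aut, aut_init, aut_acc, ts_init):
    """BFS over the product of a finite transition system and a finite
    automaton for a co-safe specification; returns a system run whose
    trace drives the automaton to an accepting state, or None.

    ts: dict state -> iterable of successor states
    labels: dict state -> atomic proposition observed at that state
    aut: dict (aut_state, proposition) -> next automaton state
         (the automaton stays put on undefined transitions)
    """
    start = (ts_init, aut.get((aut_init, labels[ts_init]), aut_init))
    parent = {start: None}
    queue = deque([start])
    while queue:
        s, q = queue.popleft()
        if q in aut_acc:
            # Reconstruct the satisfying prefix of system states.
            path, node = [], (s, q)
            while node is not None:
                path.append(node[0])
                node = parent[node]
            return path[::-1]
        for s2 in ts[s]:
            q2 = aut.get((q, labels[s2]), q)
            if (s2, q2) not in parent:
                parent[(s2, q2)] = (s, q)
                queue.append((s2, q2))
    return None

# Toy spec "eventually goal": automaton q0 --goal--> qf (accepting).
ts = {0: [1], 1: [2], 2: [2]}
labels = {0: 'a', 1: 'a', 2: 'goal'}
aut = {('q0', 'goal'): 'qf'}
run = product_search(ts, labels, aut, 'q0', {'qf'}, 0)
```

Replacing the plain BFS with a shortest-path or min-cost search over the same product graph is one way to see how a feasible controller extends to an optimal one at little extra cost.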

The second contribution of this thesis addresses the scalability of control synthesis with LTL specifications. A major limitation of the standard automaton-based approach for control with LTL specifications is that the automaton might be doubly-exponential in the size of the LTL specification. We introduce a fragment of LTL for which one can compute feasible control policies in time polynomial in the size of the system and specification. Additionally, we show how to compute optimal control policies for a variety of cost functions, and identify interesting cases when this can be done in polynomial time. These techniques are particularly relevant for online control, as one can guarantee that a feasible solution can be found quickly, and then iteratively improve on the quality as time permits.

The final contribution of this thesis is a set of algorithms for computing feasible trajectories for high-dimensional, nonlinear systems with LTL specifications. These algorithms avoid the potentially computationally expensive process of computing a discrete abstraction, and instead compute directly on the system's continuous state space. The first method uses an automaton representing the specification to directly encode a series of constrained-reachability subproblems, which can be solved in a modular fashion by using standard techniques. The second method encodes an LTL formula as mixed-integer linear programming constraints on the dynamical system. We demonstrate these approaches with numerical experiments on temporal logic motion planning problems with high-dimensional (10+ states) continuous systems.

Abstract:

This thesis is concerned with the dynamic response of a general multi-degree-of-freedom linear system with a one-dimensional nonlinear constraint attached between two points. The nonlinear constraint is assumed to consist of rate-independent conservative and hysteretic nonlinearities and may contain a viscous dissipation element. The dynamic equations for general spatial and temporal load distributions are derived for both continuous and discrete systems. The method of equivalent linearization is used to develop equations governing the approximate steady-state response to generally distributed loads with harmonic time dependence.
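For a concrete instance of equivalent linearization, consider a cubic hardening spring f(x) = kx + εx³ responding at amplitude A. The least-squares equivalent stiffness over one harmonic cycle is k_eq = k + (3/4)εA², which the sketch below verifies numerically (the values of k, ε, and A are illustrative, not from the thesis):

```python
import math

def equivalent_stiffness(k, eps, A, n=2000):
    """Least-squares equivalent linear stiffness for f(x) = k*x + eps*x**3
    over one cycle of x(t) = A*cos(w*t): k_eq = <f(x)*x> / <x**2>.
    """
    num = den = 0.0
    for i in range(n):
        x = A * math.cos(2 * math.pi * i / n)
        num += (k * x + eps * x**3) * x
        den += x * x
    return num / den

k_eq = equivalent_stiffness(k=1.0, eps=0.2, A=2.0)
# Harmonic balance predicts k + 3*eps*A**2/4 = 1.0 + 0.6 = 1.6
```

Because k_eq grows with A for ε > 0, each resonance curve bends toward higher frequencies, which is the hardening behavior discussed below; ε < 0 gives the softening case.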

The qualitative response behavior of a class of undamped chainlike structures with a nonlinear terminal constraint is investigated. It is shown that the hardening or softening behavior of every resonance curve is similar and is determined by the properties of the constraint. Also examined are the number and location of resonance curves, the boundedness of the forced response, the loci of response extrema, and other characteristics of the response. Particular consideration is given to the dependence of the response characteristics on the properties of the linear system, the nonlinear constraint, and the load distribution.

Numerical examples of the approximate steady-state response of three structural systems are presented. These examples illustrate the application of the formulation and qualitative theory. It is shown that disconnected response curves and response curves which cross are obtained for base excitation of a uniform shear beam with a cubic spring foundation. Disconnected response curves are also obtained for the steady-state response to a concentrated load of a chainlike structure with a hardening hysteretic constraint. The accuracy of the approximate response curves is investigated.

Abstract:

Changes in the sustainability of aquatic ecosystems are likely to be brought about by the widely predicted global warming. In this article, the effects of water temperature on water bodies (lakes, oceans and rivers) are reviewed, followed by the effects of temperature on aquatic organisms. Almost all aquatic organisms require exogenous heat before they can metabolise efficiently. An organism that is adapted to warm temperatures will have a higher metabolic rate, and this increases its feeding rate. In addition, an increase in temperature raises the metabolism of food organisms, so food quality can be altered. Where populations have different tolerances to temperature, the result is habitat partitioning. One effect of prolonged high temperature is that it causes water to evaporate readily. In the marine littoral this is not an important problem, as tides will replenish water in pools. Small rain pools are found in many tropical countries during the rainy season, and these dry out at intervals. The biota of such pools must have resistant stages within the life cycle that enable them to cope with periods of drying. The most important potential effects of global warming include (i) the alteration of existing coastlines, (ii) the development of more deserts on some land masses, (iii) higher productivity producing higher crop production but a greater threat of algal blooms and (iv) the processing of organic matter at surface microlayers.

Abstract:

This article is intended to open a discussion about the historical development of lakes Zirahuen, Patzcuaro and Cuitzeo in the state of Michoacan, and the postulated relationships between lake ecology and evolution. Dr Fernando De Buen, who came to Mexico in the 1930s, was the first scientist dedicated to limnology in the country; he was an adviser at the Estacion Limnologica de Patzcuaro and wrote outstanding papers dealing with Mexican lakes. The lakes of Michoacan probably formed in the late Pliocene or Holocene, when part of a tributary of the Lerma River became isolated by successive volcanic barriers to form lake basins. Lake Zirahuen is a warm monomictic waterbody with unique water dynamics among the Michoacan lakes. Because it is relatively deep (maximum depth 40 m), seasonal patterns of alternating circulation and thermal stratification develop in the lake, a feature not shared by the other two polymictic shallow lakes, Patzcuaro and Cuitzeo.

Abstract:

A major part of the support for fundamental research on aquatic ecosystems continues to be provided by the Natural Environment Research Council (NERC). Funds are released for "thematic" studies in a selected special topic or programme. "Testable Models of Aquatic Ecosystems" was a Special Topic of the NERC, initiated in 1995, the aim of which was to promote ecological modelling by making new links between experimental aquatic biologists and state-of-the-art modellers. The Topic covered both marine and freshwater systems. This paper summarises projects on aspects of the responses of individual organisms to the effects of environmental variability, on the assembly, permanence and resilience of communities, and on aspects of spatial models. The authors conclude that the NERC Special Topic has been highly successful in promoting the development and application of models, most particularly through the interplay between experimental ecologists and formal modellers.

Abstract:

The rapid rise in residential photovoltaic (PV) adoption over the past half decade has created a need in the electricity industry for a widely accessible model that estimates PV adoption under different combinations of business and policy decisions. This work analyzes historical adoption patterns and finds fiscal savings to be the single most important factor in PV adoption, with significantly greater predictive power than all other socioeconomic factors, including income and education. Based on our findings, we create an application, available on Google App Engine (GAE), that allows stakeholders including policymakers, power system researchers and regulators to study the complex, coupled relationship between PV adoption, utility economics and grid sustainability. The application allows users to experiment with different customer demographics, tier structures and subsidies, letting them tailor the application to the geographic region they are studying. This study then demonstrates the types of analyses possible with the application by examining the relative impact of different policies regarding tier structures, fixed charges and PV prices on PV adoption.

Abstract:

Computation technology has dramatically changed the world around us; you can hardly find an area that cell phones have not saturated, yet there has been a significant lack of breakthroughs in integrating computers with biological environments. This is largely the result of the incompatibility of the materials used in the two settings; biological environments and experiments tend to require aqueous conditions. To aid in this development, chemists, engineers, physicists and biologists have begun to develop microfluidics to help bridge the divide. Unfortunately, microfluidic devices have required large external support equipment to run. This thesis presents several microfluidic methods that help integrate engineering and biology by exploiting nanotechnology, pushing the field of microfluidics back toward its intended purpose: small, integrated biological and electrical devices. I demonstrate this goal by developing methods and devices to (1) separate membrane-bound proteins with the use of microfluidics, (2) use optical technology to turn fiber optic cables into protein sensors, (3) generate new fluidic devices using semiconductor materials to manipulate single cells, and (4) develop a new microfluidic genetic diagnostic assay that works with current PCR methodology to provide faster and cheaper results. All of these methods and systems can be used as components of a self-contained biomedical device.