937 results for lab-on-a-chip systems
Abstract:
Based on a Belief-Action-Outcome framework, we developed a model of senior managers' perceptions of both the antecedents and the consequences of Green IS adoption by a firm. This conceptual model and its associated hypotheses were empirically tested using a dataset generated from a survey of 405 organizations. The results suggest that coercive pressure influences the attitude toward Green IS adoption, while mimetic pressure does not. In addition, we found a significant relationship between Green IS adoption, attitude, and consideration of future consequences. Finally, we found that only long-term Green IS adoption was positively related to environmental performance.
Abstract:
This paper presents a verification of a previously developed conceptual model of security-related processes in DRM implementation. The practical applicability of the established security requirements is also checked by comparing them against four real DRM implementations (Microsoft Media DRM, Apple's iTunes, SunnComm Technologies' MediaMax DRM, and First4Internet's XCP DRM). The weaknesses of these systems that were exploited as a result of violating specific security requirements are explained, and the possibilities for avoiding such attacks by implementing the requirements at the design stage are discussed.
Abstract:
Besides their well-described use as delivery systems for water-soluble drugs, liposomes can act as solubilising agents for drugs with low aqueous solubility. A key limitation in exploiting liposome technology, however, is the availability of scalable, low-cost production methods for preparing liposomes. Here we describe a new microfluidics-based method for preparing liposomal solubilising systems that can incorporate poorly soluble drugs (in this case propofol). The setup, based on a chaotic-advection micromixer, achieved high drug loading of propofol (41 mol%) and could manufacture vesicles at prescribed sizes (between 50 and 450 nm) in a high-throughput setting. Our results demonstrate that liposome manufacturing and drug encapsulation can be merged into a single process step, reducing overall process time. These studies emphasise the flexibility and ease of applying lab-on-a-chip microfluidics to the solubilisation of poorly water-soluble drugs.
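For context on the drug-loading figure above: loading expressed in mol% is conventionally the molar fraction of drug relative to the total moles of drug plus lipid. The sketch below illustrates that calculation; the lipid choice (DPPC) and the masses are hypothetical, chosen only to land near the reported ~41 mol%, and are not taken from the study.

```python
# Minimal sketch: drug loading in mol% for a liposomal formulation.
# The lipid (DPPC) and the masses below are illustrative assumptions,
# not values reported in the study.

def drug_loading_mol_percent(n_drug_mol: float, n_lipid_mol: float) -> float:
    """Return drug loading as mol% of total (drug + lipid) moles."""
    return 100.0 * n_drug_mol / (n_drug_mol + n_lipid_mol)

propofol_molar_mass = 178.27   # g/mol (propofol, 2,6-diisopropylphenol)
dppc_molar_mass = 734.04       # g/mol (DPPC, a common liposome lipid)

n_drug = 0.0025 / propofol_molar_mass   # 2.5 mg propofol (hypothetical)
n_lipid = 0.015 / dppc_molar_mass       # 15 mg DPPC (hypothetical)

print(f"Drug loading: {drug_loading_mol_percent(n_drug, n_lipid):.1f} mol%")
```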
Abstract:
The purpose of this research is to develop design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving three key areas: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These design considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and the environmental parameters affecting its formation (methylation) and degradation (demethylation). The sampling methodology investigates a proof of concept for monitoring MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using Quartz Crystal Microbalance (QCM) sensors. This study focuses on the measurement of inorganic mercury (Hg) (e.g., Hg2+) and applies lessons learned to organic Hg (e.g., MeHg) detection. Context awareness of a WSN and its sampling strategies is enhanced using spatial analysis techniques, namely geostatistical analysis (i.e., classical variography and ordinary point kriging), to help predict the phenomena of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions about control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy, etc.) and improves the precision of control by adding potentially significant information about unmonitored locations. Two types of sensors are investigated in this study for near-optimal placement in a WSN: (1) environmental sensors (e.g., humidity, moisture, temperature) and (2) visual sensors (e.g., cameras). Near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis over randomly chosen points representing candidate sensor locations; the spatial analysis uses geostatistical methods and the optimization is carried out with Monte Carlo analysis. Visual sensor placement for omnidirectional cameras operating in a WSN is accomplished using an optimal placement metric (OPM), calculated for each grid point from line-of-sight (LOS) in a defined number of directions while taking known obstacles into account; optimal camera placements are the areas generating the largest OPMs. The statistical behaviour of this approach is examined by Monte Carlo analysis with varying numbers of obstacles and cameras in a defined space.
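As an illustration of the camera-placement step, the sketch below computes a simple line-of-sight score for each free cell of a 2-D occupancy grid by stepping along a fixed number of ray directions until an obstacle or the grid boundary is reached. The abstract does not give the exact definition of the OPM, so the scoring rule, grid, and obstacle layout here are assumptions; the sketch only conveys the general idea of ranking candidate camera locations by visibility.

```python
# Hedged sketch of a line-of-sight placement score for omnidirectional
# cameras on a 2-D occupancy grid. The exact OPM is not defined in the
# abstract; here it is approximated as the number of grid cells visible
# from a candidate point along a fixed set of ray directions.

import math

def opm(grid, row, col, n_directions=16):
    """Count cells visible from (row, col) by stepping along rays
    until an obstacle (True cell) or the grid boundary is reached."""
    rows, cols = len(grid), len(grid[0])
    visible = 0
    for k in range(n_directions):
        theta = 2 * math.pi * k / n_directions
        dr, dc = math.sin(theta), math.cos(theta)
        r, c = float(row), float(col)
        while True:
            r, c = r + dr, c + dc
            ri, ci = int(round(r)), int(round(c))
            if not (0 <= ri < rows and 0 <= ci < cols) or grid[ri][ci]:
                break          # left the grid or hit a known obstacle
            visible += 1       # cells seen by several rays count each time
    return visible

# Toy example: a 6x6 space with a small obstacle block.
grid = [[False] * 6 for _ in range(6)]
grid[2][3] = grid[3][3] = True
scores = {(r, c): opm(grid, r, c)
          for r in range(6) for c in range(6) if not grid[r][c]}
best = max(scores, key=scores.get)
print("Best candidate cell:", best, "score:", scores[best])
```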
Abstract:
This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, these environments allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher-level services that further abstract infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, relieving them of the burden of deploying and managing these resources themselves. In this work, we focus on issues related to scheduling scientific workloads in virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) an in-depth analysis of the execution characteristics of the target applications when run in virtualized environments; (2) a performance prediction methodology applicable to the target environment; and (3) a scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality-of-service guarantees. In the process of addressing these issues for our target user base (i.e., medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%. Our scheduling methodology, tested with medical image processing workloads, is compared against two baseline scheduling solutions and outperforms them by 20–30% in terms of both the number of jobs processed and resource utilization, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
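The abstract does not spell out the scheduling algorithm itself, but the sketch below shows one way predicted execution times and deadlines could be combined in a deadline-aware dispatcher: jobs are considered in earliest-deadline order and placed on the virtual machine that frees up first, and a job is rejected if its predicted completion time exceeds its deadline. All class names, job names, and numbers are illustrative assumptions, not the dissertation's actual method.

```python
# Hedged sketch of a deadline-aware scheduler that uses predicted
# execution times, in the spirit of the approach described above.
# This is an illustrative earliest-deadline-first variant, not the
# algorithm from the dissertation.

from dataclasses import dataclass, field

@dataclass
class Job:
    name: str
    predicted_runtime: float   # from the prediction model (seconds)
    deadline: float            # absolute deadline (seconds)

@dataclass
class VirtualMachine:
    name: str
    available_at: float = 0.0              # time the VM next becomes free
    schedule: list = field(default_factory=list)

def schedule_jobs(jobs, vms):
    """Assign jobs (earliest deadline first) to the VM that frees up
    soonest; reject a job if no VM can meet its predicted deadline."""
    rejected = []
    for job in sorted(jobs, key=lambda j: j.deadline):
        vm = min(vms, key=lambda v: v.available_at)
        finish = vm.available_at + job.predicted_runtime
        if finish <= job.deadline:
            vm.available_at = finish
            vm.schedule.append((job.name, finish))
        else:
            rejected.append(job.name)       # deadline cannot be met
    return rejected

# Hypothetical medical-imaging jobs and two VMs.
jobs = [Job("segmentation", 120, 400), Job("registration", 300, 500),
        Job("denoising", 60, 150)]
vms = [VirtualMachine("vm-1"), VirtualMachine("vm-2")]
print("Rejected:", schedule_jobs(jobs, vms))
for vm in vms:
    print(vm.name, vm.schedule)
```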
Abstract:
Recreational fisheries in North America are valued between $47.3 billion and $56.8 billion. To conserve populations effectively, fisheries managers must make strategic decisions based on sound science and knowledge of population ecology. Competitive fishing, in the form of tournaments, has become an important part of recreational fisheries and is common on large waterbodies, including the Great Lakes. Black Bass, Micropterus spp., are top predators and among the most sought-after species in competitive catch-and-release tournaments. This study investigated catch-and-release tournaments as an assessment tool, using mark-recapture, for Largemouth Bass (>305 mm) populations in the Tri Lakes and the Bay of Quinte, part of the eastern basin of Lake Ontario. The population in the Tri Lakes (1999-2002) was estimated to be stable at between 21,928 and 29,780 fish, and the population in the Bay of Quinte (2012-2015) was estimated at between 31,825 and 54,029 fish. Survival in the Tri Lakes varied throughout the study period, from 31% to 54%, while survival in the Bay of Quinte remained stable at 63%. Differences in survival may be due to differences in fishing pressure, as 34-46% of the Largemouth Bass population in the Tri Lakes is harvested annually and only 19% of catch was attributed to tournament angling. Many biological issues still surround catch-and-release tournaments, particularly concerning displacement from initial capture sites. Most previous studies have focused on small inland lakes and coastal areas, displacing bass relatively short distances. My study displaced Largemouth and Smallmouth Bass up to 100 km and found very low rates of return: only 1 of 18 Largemouth Bass returned 15 km and 1 of 18 Smallmouth Bass returned 135 km. Both species remained near the release sites for an average of approximately two weeks before dispersing. Tournament organizers should consider the use of satellite release locations to facilitate dispersal and prevent stockpiling at the release site. Catch-and-release tournaments proved to be a valuable tool for assessing population variables and the effects of long-distance displacement, through the use of mark-recapture and acoustic telemetry on large lake systems.
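The abstract does not state which mark-recapture estimator was used. As background, the sketch below shows Chapman's modification of the Lincoln-Petersen estimator, one standard two-sample mark-recapture formula; the catch and recapture counts are hypothetical, not data from the study.

```python
# Hedged sketch: Chapman's modification of the Lincoln-Petersen
# mark-recapture estimator. The study's exact estimator is not stated
# in the abstract, and the numbers below are illustrative only.

def chapman_estimate(marked_first: int, caught_second: int, recaptured: int) -> float:
    """Population estimate N-hat = ((M+1)(C+1) / (R+1)) - 1, where M fish
    are marked in the first event, C are caught in the second event,
    and R of those are recaptures."""
    return (marked_first + 1) * (caught_second + 1) / (recaptured + 1) - 1

# Illustrative (hypothetical) values: 500 marked, 450 caught later, 8 recaptured.
print(f"Estimated population: {chapman_estimate(500, 450, 8):,.0f} fish")
```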