12 results for Cement-based anodic system

in Digital Commons at Florida International University


Relevance: 100.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) designs have grown rapidly in processing power while keeping the hardware circuit roughly the same size. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially as energy consumption and chip area become two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can deliver significant performance improvements for highly mathematical calculations or repeated functions, so the performance of an SoC system can be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core, and the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted into a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications such as performance, energy consumption, and resource costs are measured and analyzed, the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves a 7.9X performance improvement and saves 75.85% of energy consumption.
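
To make the profiling step concrete, the sketch below ranks hypothetical per-function profile records by their estimated cycle cost, the kind of cycles-per-loop and loop-rounds ranking used to pick accelerator candidates. The function names and numbers are invented for illustration and are not taken from the H.264 CODEC study.

```python
# Minimal sketch of profiling-driven hotspot ranking, assuming hypothetical
# per-function profile records (names and numbers are illustrative only).
from dataclasses import dataclass


@dataclass
class FunctionProfile:
    name: str
    cycles_per_loop: int   # average cycles spent per loop iteration
    loop_rounds: int       # loop-body executions per call
    calls: int             # call count observed during profiling


def total_cycles(p: FunctionProfile) -> int:
    """Estimate the total cycle cost attributable to a function."""
    return p.cycles_per_loop * p.loop_rounds * p.calls


def rank_hotspots(profiles, top_n=3):
    """Return the functions that dominate execution time and are therefore
    candidates for conversion into FPGA accelerators."""
    return sorted(profiles, key=total_cycles, reverse=True)[:top_n]


if __name__ == "__main__":
    profiles = [
        FunctionProfile("motion_estimation", cycles_per_loop=120, loop_rounds=396, calls=3000),
        FunctionProfile("entropy_coding", cycles_per_loop=40, loop_rounds=64, calls=3000),
        FunctionProfile("deblocking_filter", cycles_per_loop=25, loop_rounds=128, calls=3000),
    ]
    for p in rank_hotspots(profiles):
        print(f"{p.name}: ~{total_cycles(p):,} cycles")
```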


Relevance: 100.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers, and it is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed; however, they have many shortcomings. Few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. There has been no empirical study that experimentally tested the effectiveness of any of these KB tools, and the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design that addresses the above shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation, system restrictiveness and decisional guidance, were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him or her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge implementation strategy, restrictive or guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task). The experimental task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results are: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system as easier to use than the Guidance system.

Relevance: 100.00%

Publisher:

Abstract:

Unmanned Aerial Vehicles (UAVs) may develop cracks, erosion, delamination, or other damage due to aging, fatigue, or extreme loads. Identifying this damage is critical for the safe and reliable operation of the systems. Structural Health Monitoring (SHM) is capable of determining the condition of a system automatically and continually by processing and interpreting data collected from a network of sensors embedded in the system. With this awareness of a system's health condition, SHM can greatly reduce operational costs and speed up maintenance processes. The purpose of this study is to develop an effective, low-cost, flexible, and fault-tolerant structural health monitoring system. The proposed Index Based Reasoning (IBR) system started as a simple look-up-table-based diagnostic system; later, Fast Fourier Transform analysis and neural-network diagnosis with self-learning capabilities were added. The current version is capable of classifying different health conditions from learned characteristic patterns after training with sensory data acquired from the operating system under different conditions. The proposed IBR systems are hierarchical, distributed networks deployed into systems to monitor their health conditions. Each IBR node processes the sensory data to extract features of the signal, and classification tools are then used to evaluate the local condition as a health index (HI) value. The HI values are carried to other IBR nodes at the next level of the structured network, and the overall health condition of the system is obtained by evaluating all the local health conditions. The performance of IBR systems has been evaluated in both simulation and experimental studies. The IBR system has been proven successful on simulated cases of a turbojet engine, a high-displacement actuator, and a quad-rotor helicopter, and in its application to experimental data from a four-rotor helicopter, IBR also performed with acceptable accuracy. The proposed IBR system is well suited to serve as the onboard structural health management system for low-cost UAVs, and it can also be a backup system for aircraft and advanced Space Utility Vehicles.
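
The sketch below illustrates the node-level idea on hypothetical vibration data: an FFT-based feature is extracted, mapped to a health index (HI) in [0, 1], and a parent node aggregates the children's HI values. The frequency-deviation mapping and the min-aggregation are illustrative assumptions, not the dissertation's actual look-up tables or neural-network classifiers.

```python
# Illustrative sketch of a single IBR node and a parent-level aggregation,
# assuming a hypothetical feature-to-HI mapping.
import numpy as np


def extract_features(signal, fs):
    """Return the dominant vibration frequency from an FFT of the sensor signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin


def local_health_index(dominant_freq, nominal_freq):
    """Map deviation from the nominal frequency to a health index in [0, 1]."""
    deviation = abs(dominant_freq - nominal_freq) / nominal_freq
    return float(max(0.0, 1.0 - deviation))


def system_health(child_indices):
    """Parent IBR node: the overall condition is limited by the worst child."""
    return min(child_indices)


if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    healthy = np.sin(2 * np.pi * 50 * t)
    degraded = np.sin(2 * np.pi * 57 * t)   # shifted resonance indicates damage
    his = [local_health_index(extract_features(s, fs), 50.0) for s in (healthy, degraded)]
    print("local HIs:", his, "-> system HI:", system_health(his))
```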

Relevance: 100.00%

Publisher:

Abstract:

Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system is still a challenging task. First, the complexity of application workloads, as well as the interference among competing workloads, makes it difficult to understand the VMs' resource demands for meeting their Quality of Service (QoS) targets. Second, the dynamics of the applications and the system also make it difficult to maintain the desired QoS target as the environment changes. Third, the transparency of virtualization presents a hurdle for the guest-layer application and the host-layer VM scheduler to cooperate to improve application QoS and system efficiency. This dissertation proposes to address the above challenges through fuzzy modeling and control-theory-based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources automatically and online, based on the observed workload and resource usage. Second, to enable fast adaptation in resource management, the fuzzy modeling approach is integrated with a predictive-control-based controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach, which can quickly track the applications' QoS targets and optimize resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management solutions, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve application QoS and resource-usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks including TPC-H, RUBiS, and TerraFly. The results demonstrate that the fuzzy-modeling-based approach improves the accuracy of resource prediction by up to 31.4% compared to conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets on an oversubscribed system, and it is able to manage dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when the resources are contended by dynamic workloads.
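
As a rough illustration of the two ingredients, the sketch below combines a toy fuzzy-rule estimate of a VM's CPU demand with a one-step predictive allocation update. The membership functions, rules, and gain are hypothetical stand-ins and do not reproduce the dissertation's FMPC design.

```python
# Highly simplified sketch: fuzzy-rule demand estimation plus a one-step
# predictive allocation update (all rules and constants are hypothetical).
def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)


def fuzzy_cpu_demand(request_rate):
    """Estimate CPU demand (cores) from a workload's request rate (req/s)."""
    rules = [  # (membership in a request-rate region, CPU-demand consequent)
        (triangular(request_rate, 0, 0, 200), 0.5),
        (triangular(request_rate, 100, 300, 500), 2.0),
        (triangular(request_rate, 400, 800, 800), 4.0),
    ]
    weight = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / weight if weight else 0.5


def predictive_allocation(current_alloc, predicted_demand, gain=0.6):
    """Move the allocation part way toward the predicted demand each interval."""
    return current_alloc + gain * (predicted_demand - current_alloc)


if __name__ == "__main__":
    alloc = 1.0
    for rate in (120, 260, 420, 640):
        alloc = predictive_allocation(alloc, fuzzy_cpu_demand(rate))
        print(f"rate={rate} req/s -> allocate ~{alloc:.2f} cores")
```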

Relevance: 100.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers, and it is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, they have many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him or her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than the system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved a task without using the system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance, or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance compared to the Control system, and the subjects perceived the Restrictive system as easier to use than the Guidance system.
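
The difference between the two knowledge-base styles can be illustrated with a toy example of one design decision, choosing a relationship's cardinality: the restrictive style exposes only the options the knowledge base allows, while the guidance style keeps all options but attaches context-specific advice. The prompts and the single heuristic shown are hypothetical and are not CODA's actual knowledge base.

```python
# Toy contrast of restrictive vs. guidance knowledge-base styles at one
# design decision; the options and advice are hypothetical.
def restrictive_choice(valid_options):
    """Restrictive style: the designer may only pick from options the
    knowledge base has already judged valid for this design step."""
    print("Choose one of:", ", ".join(valid_options))
    return valid_options[0]  # stand-in for interactive input


def guidance_choice(all_options, advice):
    """Guidance style: every option stays available, but context-specific
    advice is shown so the designer can reconsider an error-prone choice."""
    for option in all_options:
        print(f"{option:12s} {advice.get(option, '')}")
    return all_options[0]  # stand-in for interactive input


if __name__ == "__main__":
    options = ["one-to-one", "one-to-many", "many-to-many"]
    advice = {"many-to-many": "non-experts often miss the intersection entity here"}
    restrictive_choice(["one-to-many"])   # KB removed the error-prone paths
    guidance_choice(options, advice)      # KB explains instead of restricting
```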

Relevance: 100.00%

Publisher:

Abstract:

Hazardous radioactive liquid waste is the legacy of more than 50 years of plutonium production associated with the United States' nuclear weapons program. It is estimated that more than 245,000 tons of nitrate wastes are stored at facilities such as the single-shell tanks (SST) at the Hanford Site in the state of Washington and the Melton Valley storage tanks at Oak Ridge National Laboratory (ORNL) in Tennessee. In order to develop an innovative new technology for the destruction and immobilization of nitrate-based radioactive liquid waste, the United States Department of Energy (DOE) initiated the research project that resulted in the technology known as the Nitrate to Ammonia and Ceramic (NAC) process. However, because the nitrate anion is highly mobile and difficult to immobilize, especially in the relatively porous cement-based grout that has been used to date to immobilize liquid waste, it presents a major obstacle to environmental clean-up initiatives. Thus, in an effort to contribute to the existing body of knowledge and enhance the efficacy of the NAC process, this research involved the experimental measurement of the rheological and heat-transfer behavior of the NAC product slurry and the determination of the optimal operating parameters for the continuous NAC chemical reaction process. Test results indicate that the NAC product slurry exhibits typical non-Newtonian flow behavior. Correlation equations for the slurry's rheological properties and heat-transfer rate in pipe flow have been developed; these should prove valuable in the design of a full-scale NAC processing plant. The 20-percent slurry exhibited typical dilatant (shear-thickening) behavior and was in the turbulent flow regime due to its lower viscosity, while the 40-percent slurry exhibited typical pseudoplastic (shear-thinning) behavior and remained in the laminar flow regime throughout its experimental range. The reactions were found to be more efficient in the lower temperature range investigated. With respect to leachability, the experimental final NAC ceramic waste form is comparable to the final product of vitrification, the technology chosen by DOE to treat these wastes. As the NAC process has the potential to reduce the volume of nitrate-based radioactive liquid waste by as much as 70 percent, it not only promises to enhance environmental remediation efforts but also to effect substantial cost savings.
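
Dilatant versus pseudoplastic behavior of this kind is commonly characterized with the Ostwald-de Waele power-law model, tau = K * (shear rate)^n, where n > 1 indicates shear thickening and n < 1 shear thinning. The sketch below fits that model to viscometer-style data; the sample numbers are made up and are not the NAC slurry measurements.

```python
# Sketch: classify dilatant vs. pseudoplastic behaviour by fitting the
# power-law model tau = K * gamma_dot**n to hypothetical viscometer data.
import numpy as np


def fit_power_law(shear_rate, shear_stress):
    """Fit log(tau) = log(K) + n*log(gamma_dot); return (K, n)."""
    n, log_k = np.polyfit(np.log(shear_rate), np.log(shear_stress), 1)
    return float(np.exp(log_k)), float(n)


def classify(n, tol=0.05):
    if n > 1 + tol:
        return "dilatant (shear thickening)"
    if n < 1 - tol:
        return "pseudoplastic (shear thinning)"
    return "approximately Newtonian"


if __name__ == "__main__":
    gamma_dot = np.array([10.0, 50.0, 100.0, 500.0])   # shear rates, 1/s
    # Hypothetical shear-stress readings for two slurries (Pa):
    dilatant_tau = 0.02 * gamma_dot ** 1.3
    pseudoplastic_tau = 5.0 * gamma_dot ** 0.6
    for label, tau in [("slurry A", dilatant_tau), ("slurry B", pseudoplastic_tau)]:
        K, n = fit_power_law(gamma_dot, tau)
        print(f"{label}: K={K:.3f}, n={n:.2f} -> {classify(n)}")
```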

Relevance: 100.00%

Publisher:

Abstract:

The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nationwide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents selected according to the size of the hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide designed by the researcher was used, and respondents were encouraged to amplify their recollections as desired. Audiotapes were transcribed, and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study: internal verification was done through interview transcript review and feedback by respondents, and external verification was done through review of, and feedback on, the data analysis by readers who were experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience, relationality, corporeality, temporality, and spatiality, were also examined to reveal the everydayness of making change.

Relevance: 100.00%

Publisher:

Abstract:

Current demands for accountability in education emphasize outcome-based program evaluation and tie program funding to individual student performance. As has been the case for elementary and secondary programs, demands for accountability have increased pressure on adult educators to show evidence of the benefits of their programs in order to justify their financial support. In Florida, recent legislation fundamentally changes the delivery of adult education in the state by establishing a performance-based funding system that is based on outcomes related to the retention, completion, and employment of program participants. A performance-based funding system requires an evaluation process that stresses outcome indicators over indicators that focus on program context or process. Although the state has adopted indicators of program quality to evaluate its adult education programs, these indicators focus mostly on program processes rather than student outcomes. In addition, the indicators are not specifically tied to workforce development outcomes, a priority for federal and local funding agents. Improving the accountability of adult education programs and defining the role of these programs in Florida's Workforce Development System have become priorities for policy makers across the state. Another priority has been to involve adult education practitioners in every step of this process. This study was conducted to determine which performance indicators, as judged by the directors and supervisors of adult education programs in the state of Florida, are important and feasible in measuring the quality and effectiveness of these programs. The results of the study indicated that, both statewide and by region, the respondents consistently gave the highest ratings for both importance and feasibility to the indicators of Program Context, which reflect the needs, composition, and structure of the programs, and to the indicators of Educational Gain, which reflect learner progress in the attainment of basic skills and competencies. In turn, the respondents gave the lowest ratings for both importance and feasibility to the indicators in the areas of Return on the State's Investment, Efficiency, Retention, and Workforce Training. In general, the indicators that received high ratings for importance also received high ratings for feasibility.

Relevance: 100.00%

Publisher:

Abstract:

Surface Plasmon Resonance (SPR) and localized surface plasmon resonance (LSPR) biosensors have brought a revolutionary change to the in vitro study of biological and biochemical processes due to their ability to measure extremely small changes in surface refractive index (RI), binding equilibrium, and kinetics. Strategies based on LSPR have been employed to enhance sensitivity for a variety of applications, such as the diagnosis of diseases, environmental analysis, food safety, and chemical threat detection. In LSPR spectroscopy, the absorption and scattering of light are greatly enhanced at frequencies that excite the LSPR, resulting in a characteristic extinction spectrum that depends on the RI of the surrounding medium. Compositional and conformational changes within the medium near the sensing surface can therefore be detected as shifts in the extinction spectrum. This dissertation focuses on the development and evaluation of highly sensitive LSPR biosensors for the in situ study of biomolecular binding processes by incorporating nanotechnology. Compared to traditional methods for biomolecular binding studies, LSPR-based biosensors offer real-time, label-free detection. First, we modified the gold sensing surface of LSPR-based biosensors using nanomaterials such as gold nanoparticles (AuNPs) and polymer to enhance surface absorption and sensitivity. The performance of this type of biosensor was evaluated in a binding affinity study of small heavy-metal molecules, where it exhibited roughly 7-fold sensitivity enhancement and binding-kinetics measurement capability compared to traditional biosensors. Second, a miniaturized cell culture system was integrated into the LSPR-based biosensor system for real-time biomarker signaling-pathway studies and drug efficacy studies with living cells. To the best of our knowledge, this is the first LSPR-based sensing platform capable of living-cell studies. We demonstrated this capability by studying the VEGF signaling pathway in living SKOV-3 cells; the results show that the VEGF secretion level from SKOV-3 cells is 0.0137 ± 0.0012 pg per cell. Moreover, we demonstrated bevacizumab drug regulation of the VEGF signaling pathway using this biosensor. This sensing platform could potentially help in studying biomolecular binding kinetics, which elucidates the underlying mechanisms of biotransportation and drug delivery.
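
The readout principle described above can be sketched as follows: locate the LSPR extinction peak before and after binding and convert the wavelength shift into an estimated change in surface refractive index. The idealized spectra and the sensitivity factor (nm per RIU) in the sketch are hypothetical, not values reported in this dissertation.

```python
# Sketch of LSPR peak-shift readout on idealized, hypothetical spectra.
import numpy as np


def peak_wavelength(wavelengths, extinction):
    """Return the wavelength at which the extinction spectrum peaks."""
    return wavelengths[np.argmax(extinction)]


def refractive_index_change(shift_nm, sensitivity_nm_per_riu=200.0):
    """Convert a peak shift into an estimated change in surface RI."""
    return shift_nm / sensitivity_nm_per_riu


if __name__ == "__main__":
    wl = np.linspace(500.0, 700.0, 2001)                       # nm, 0.1 nm grid
    lorentz = lambda c: 1.0 / (1.0 + ((wl - c) / 25.0) ** 2)   # idealized extinction band
    before, after = lorentz(600.0), lorentz(603.2)             # binding red-shifts the peak
    shift = peak_wavelength(wl, after) - peak_wavelength(wl, before)
    print(f"peak shift: {shift:.1f} nm -> dRI ~ {refractive_index_change(shift):.5f} RIU")
```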

Relevance: 100.00%

Publisher:

Abstract:

Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching-logic-based controller with specified thresholds on prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before a traffic breakdown occurs, the proposed VSL's goal is to prevent or postpone the breakdown by decreasing the inflow and achieving a uniform distribution of speed and flow. After a breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream. In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, Vehicle-to-Infrastructure communication is modeled in the microscopic simulation as part of this research to obtain individual vehicle trajectories. In this system, the wavelet transform is used to analyze aggregated individual-vehicle speed data to determine the locations of congestion. The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment, and several enhancements were proposed in this study to account for breakdown characteristics at bottleneck locations in the calibration process. In this dissertation, the performance of the shockwave-based VSL is compared to that of VSL systems with different fixed message-sign locations using the calibrated microscopic model. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during the breakdown.
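
A minimal sketch of a heuristic, threshold-based switching logic is shown below: a reduced speed limit is posted when detector speed and occupancy cross thresholds, and the active VSL location is placed just upstream of the back of the queue so that it moves upstream with the congestion. The thresholds, speed-limit steps, and sign spacing are illustrative assumptions, not the calibrated values from this dissertation.

```python
# Sketch of a threshold-based VSL switching logic with hypothetical values.
import math


def select_speed_limit(mean_speed_mph, occupancy_pct,
                       base_limit=70, thresholds=((45, 25, 40), (55, 18, 55))):
    """Pick a reduced speed limit when detector data indicate breakdown risk.

    thresholds: tuples of (speed below, occupancy above, posted limit),
    checked from most to least restrictive.
    """
    for speed_thr, occ_thr, limit in thresholds:
        if mean_speed_mph < speed_thr and occupancy_pct > occ_thr:
            return limit
    return base_limit


def active_vsl_location(congestion_tail_mi, sign_spacing_mi=0.5):
    """Place the active VSL sign just upstream of the back of the queue, so the
    control moves upstream as the congested region propagates."""
    return math.ceil(congestion_tail_mi / sign_spacing_mi) * sign_spacing_mi


if __name__ == "__main__":
    print(select_speed_limit(mean_speed_mph=42, occupancy_pct=28))  # -> 40
    print(select_speed_limit(mean_speed_mph=60, occupancy_pct=10))  # -> 70 (no reduction)
    print(active_vsl_location(congestion_tail_mi=1.3))              # -> 1.5 miles upstream
```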
