842 results for performance-based engineering


Relevance: 40.00%

Abstract:

The purpose of the study was to investigate the physiological and psychological benefits provided by a self-selected health and wellness course to a racially and ethnically diverse student population. It was designed to determine whether students from a 2-year Hispanic-Serving Institution (HSI) in a large metropolitan area would enhance their capacity to perform physical activities, increase their knowledge of health topics, and raise their exercise self-efficacy after completing a course that included educational and activity components over a period of 16 weeks. A total of 185 students voluntarily agreed to participate in the study. An experimental group was selected from six sections of a health and wellness course, and a comparison group from students in a student life skills course. All participants were given anthropometric tests of physical fitness, a knowledge test, and an exercise self-efficacy scale at the beginning and at the conclusion of the semester. An ANCOVA analysis, with the pretest scores as the covariate and the difference scores as the dependent variable, indicated a significant improvement of the experimental group over the comparison group in five of the seven anthropometric tests. In addition, the experimental group improved in two of the three sections of the exercise self-efficacy scale, indicating greater confidence than the comparison group in participating in physical activities in spite of barriers. The experimental group also gained more knowledge of health-related topics than the comparison group at the .05 significance level. Results indicated beneficial outcomes for students enrolled in a 16-week health and wellness course. The study has several implications for practitioners, faculty members, educational policy makers, and researchers in terms of implementing strategies to promote healthy behaviors in college students and to encourage them to engage in regular physical activities throughout their college years.
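
The ANCOVA described above can be illustrated with a minimal sketch in Python. The data, column names, and effect sizes below are synthetic and hypothetical; the sketch only shows the general form of the analysis (difference score as the dependent variable, pretest as the covariate, group as the factor), not the study's actual data or results.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
n = 185
df = pd.DataFrame({
    "group": rng.choice(["experimental", "comparison"], n),
    "pretest": rng.normal(50, 10, n),
})
# Synthetic effect: the experimental group improves more from pre- to posttest
df["posttest"] = (df["pretest"] + rng.normal(2, 5, n)
                  + np.where(df["group"] == "experimental", 4, 0))
df["diff"] = df["posttest"] - df["pretest"]

# ANCOVA: difference score regressed on the pretest covariate and the group factor
model = smf.ols("diff ~ pretest + C(group)", data=df).fit()
print(anova_lm(model, typ=2))  # tests the group effect adjusted for the pretest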

Relevance: 40.00%

Abstract:

Recent advances in electric and hybrid electric vehicles and rapid developments in electronic devices have increased the demand for high-power and high-energy-density lithium-ion batteries. Graphite (theoretical specific capacity: 372 mAh/g), used in commercial anodes, cannot meet these demands. Amorphous SnO2 anodes (theoretical specific capacity: 781 mAh/g) have been proposed as alternative anode materials. However, these materials have poor conductivity, undergo a large volume change during charging and discharging, and suffer a large irreversible capacity loss, leading to poor cycle performance. To address the issues related to SnO2 anodes, we propose to synthesize porous SnO2 composites using the electrostatic spray deposition technique. First, porous SnO2/CNT composites were fabricated, and the effects of the deposition temperature (200, 250, 300 °C) and CNT content (10, 20, 30, 40 wt%) on the electrochemical performance of the anodes were studied. Compared to pure SnO2 and pure CNT, the composite anodes showed better discharge capacity and cyclability. A CNT content of 30 wt% and a deposition temperature of 250 °C were found to be the optimal conditions with regard to energy capacity, whereas the sample with 20 wt% CNT deposited at 250 °C exhibited good capacity retention. This can be ascribed to the porous nature of the anodes and the improvement in conductivity from the addition of CNT. Electrochemical impedance spectroscopy (EIS) studies were carried out to examine in detail the change in the surface film resistance with cycling. By fitting the EIS data to an equivalent circuit model, the values of the circuit components, which represent the surface film resistance, were obtained. The higher the CNT content in the composite, the lower the change in surface film resistance at a given voltage upon cycling. The surface resistance increased with the depth of discharge and decreased slightly at the fully lithiated state. Graphene was also added to improve the performance of pure SnO2 anodes. The composites heated at 280 °C showed better energy capacity and energy density. The specific capacities of the as-deposited and post-heat-treated samples were 534 and 737 mAh/g after 70 cycles. At the 70th cycle, the energy densities of the composites at 195 °C and 280 °C were 1240 and 1760 Wh/kg, respectively, which are much higher than those of commercially used graphite electrodes (37.2–74.4 Wh/kg). Both SnO2/CNT- and SnO2/graphene-based composites, with higher energy densities and capacities than pure SnO2, can make a significant impact on the development of new batteries for electric vehicles and portable electronics applications.
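
As an illustration of the equivalent-circuit fitting step mentioned above, the following Python sketch fits a synthetic impedance spectrum to a simplified Randles-type circuit (a series resistance R_s in series with a parallel R_sf/C_sf element standing in for the surface film). The circuit, parameter values, and data are hypothetical; the dissertation's actual equivalent circuit model is more detailed.

import numpy as np
from scipy.optimize import least_squares

def z_model(params, f):
    """Impedance of R_s + (R_sf || C_sf) at frequencies f (Hz)."""
    rs, rsf, csf = params
    w = 2 * np.pi * f
    z_par = rsf / (1 + 1j * w * rsf * csf)   # R_sf in parallel with C_sf
    return rs + z_par

# Synthetic "measured" spectrum with a little noise (all values hypothetical)
f = np.logspace(-1, 5, 60)                   # Hz
true = (5.0, 40.0, 2e-5)                     # ohm, ohm, farad
noise = 1 + 0.01 * np.random.default_rng(0).standard_normal(f.size)
z_meas = z_model(true, f) * noise

def residuals(p):
    # Stack real and imaginary parts so the fit uses the full complex spectrum
    z = z_model(p, f)
    return np.concatenate([z.real - z_meas.real, z.imag - z_meas.imag])

fit = least_squares(residuals, x0=[1.0, 10.0, 1e-6], bounds=(0, np.inf))
rs, rsf, csf = fit.x
print(f"R_s = {rs:.2f} ohm, R_sf = {rsf:.2f} ohm, C_sf = {csf:.2e} F")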

Relevance: 40.00%

Abstract:

The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high performance with flexible architectures to allow for quick upgradability. Technology continues to advance in image display resolutions, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, but with trade-offs among processing performance (achieving specified frame rates on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This work contains dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance. The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe-distance factor and develop an algorithm for detecting the occurrence of occlusion during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analyzed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient-threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gesture sets involved in different applications may vary; it is therefore essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
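
A minimal software sketch of a running-average background model with a globally adapted, mean-based threshold is shown below. It is meant only to convey the general idea behind an RAMT-style detector; the dissertation's exact formulation and its FPGA implementation are not reproduced, and the parameter values are hypothetical.

import numpy as np

class RunningAverageDetector:
    def __init__(self, alpha=0.02, k=2.5):
        self.alpha = alpha      # background learning rate (hypothetical)
        self.k = k              # threshold scale factor (hypothetical)
        self.background = None

    def detect(self, frame):
        """frame: 2-D grayscale array. Returns a boolean target mask."""
        f = frame.astype(np.float64)
        if self.background is None:
            self.background = f.copy()
            return np.zeros(f.shape, dtype=bool)
        diff = np.abs(f - self.background)
        # Global threshold adapted from the mean difference over the whole
        # frame, so it tracks changing scenes (indoor/outdoor) automatically.
        threshold = self.k * diff.mean()
        mask = diff > threshold
        # Update the running-average background only where no target was found
        self.background[~mask] = ((1 - self.alpha) * self.background
                                  + self.alpha * f)[~mask]
        return mask

# Usage: feed consecutive grayscale frames
det = RunningAverageDetector()
frames = np.random.default_rng(0).random((5, 120, 160))
for fr in frames:
    mask = det.detect(fr)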

Relevance: 40.00%

Abstract:

In "Appraising Work Group Performance: New Productivity Opportunities in Hospitality Management," a discussion by Mark R. Edwards, Associate Professor, College of Engineering, Arizona State University, and Leslie Edwards Cummings, Assistant Professor, College of Hotel Administration, University of Nevada, Las Vegas, the authors initially provide: "Employee group performance variation accounts for a significant portion of the degree of productivity in the hotel, motel, and food service sectors of the hospitality industry. The authors discuss TEAMSG, a microcomputer-based approach to appraising and interpreting group performance. TEAMSG appraisal allows an organization to profile and to evaluate groups, facilitating the targeting of training and development decisions and interventions, as well as the more equitable distribution of organizational rewards." "The caliber of employee group performance is a major determinant in an organization's productivity and success within the hotel and food service industries," Edwards and Cummings say. "Gaining accurate information about the quality of performance of such groups as organizational divisions, individual functional departments, or work groups can be as enlightening..." the authors further reveal. This perspective is especially important not only for strategic human resources planning purposes, but also for diagnosing development needs and for differentially distributing organizational rewards. The authors note that employee requirements in an unpredictable environment, which the hospitality industry largely is, are difficult to quantify. In an effort to measure elements of performance, Edwards and Cummings look to TEAMSG, an acronym for Team Evaluation and Management System for Groups, and develop the concept. In discussing background for employees, Edwards and Cummings point out that employees, at the individual level, must often possess and exercise varied skills. In group circumstances, employees often work at outside locations or move from corporate unit to unit, as in the case of a project team. Being able to transcend an individual-to-group mentality is imperative. "A solution which addresses the frustration and lack of motivation on the part of the employee is to coach, develop, appraise, and reward employees on the basis of group achievement," say the authors. "An appraisal, effectively developed and interpreted, has at least three functions," Edwards and Cummings suggest, and go on to define them. The authors place great emphasis on rewards and interventions to bolster the assertion set forth in their thesis statement. Edwards and Cummings warn that individual agendas can threaten, erode, and undermine group performance; there is no "I" in TEAM.

Relevance: 40.00%

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under the constraints of energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if hardware acceleration is used to speed up the element that incurs performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware/software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed, and the trade-off among these three factors is compared and balanced. Different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique. The system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
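
As a simple illustration of the profiling step described above, the Python sketch below uses cProfile to locate the hotspot function of a toy encoder. The function names and workload are hypothetical stand-ins; the dissertation itself profiled an H.264 CODEC core with cycle-level attributes rather than a Python program.

import cProfile
import pstats
import io

def transform_block(block):
    # Hypothetical math-heavy kernel: the kind of repeated, loop-bound code
    # that becomes a candidate for an FPGA accelerator
    return [sum(b * (i + 1) for i, b in enumerate(block)) for _ in range(64)]

def encode(frames):
    out = []
    for frame in frames:
        for block in frame:
            out.append(transform_block(block))
    return out

frames = [[[j % 7 for j in range(16)] for _ in range(50)] for _ in range(20)]

profiler = cProfile.Profile()
profiler.enable()
encode(frames)
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())   # the top entry is the hotspot to accelerate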

Relevance: 40.00%

Abstract:

Drug targeting is an active area of research, and nano-scaled drug delivery systems hold tremendous potential for the treatment of neoplasms. In this study, a novel cyclodextrin (CD)-based nanoparticle drug delivery system was assembled and characterized for the therapy of folate receptor-positive [FR(+)] cancer. Water-soluble folic acid (FA)-conjugated CD carriers (FACDs) were successfully synthesized, and their structures were confirmed by 1D/2D nuclear magnetic resonance (NMR), matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS), high-performance liquid chromatography (HPLC), Fourier transform infrared spectroscopy (FTIR), and circular dichroism. Drug complexes of adamantane (Ada) and cytotoxic doxorubicin (Dox) with FACD were readily obtained by mixed-solvent precipitation. The average size of FACD-Ada-Dox was 1.5–2.5 nm. The host-guest association constant Ka was 1,639 M−1, as determined by induced circular dichroism, and the hydrophilicity of the FACDs was greatly enhanced compared to unmodified CD. Cellular uptake and FR-binding competition experiments demonstrated an efficient and preferentially targeted delivery of Dox into FR-positive tumor cells, and a sustained drug release profile was observed in vitro. The delivery of Dox into FR(+) cancer cells via endocytosis was observed by confocal microscopy, and drug uptake of the targeted nanoparticles was 8-fold greater than that of non-targeted drug complexes. Our docking results suggest that FA, FACD, and FACD-Ada-Dox could bind the human hedgehog-interacting protein, which contains an FR domain. Mouse cardiomyocytes as well as fibroblasts treated with FACD-Ada-Dox had significantly lower levels of reactive oxygen species, with increased glutathione content and glutathione peroxidase activity, indicating a reduced potential for Dox-induced cardiotoxicity. These results indicate that the targeted drug complex possesses high drug association and sustained drug release properties with good biocompatibility and physiological stability. The novel FA-conjugated β-CD-based drug complex may be promising as an anti-tumor treatment for FR(+) cancer.

Relevance: 40.00%

Abstract:

Conceptual database design is an unusually difficult and error-prone task for novice designers. This study examined how two training approaches, rule-based and pattern-based, might improve performance on database design tasks. A rule-based approach prescribes a sequence of rules for modeling conceptual constructs and the actions to be taken at various stages while developing a conceptual model. A pattern-based approach presents data modeling structures that occur frequently in practice and prescribes guidelines on how to recognize and use these structures. This study describes the conceptual framework, experimental design, and results of a laboratory experiment that employed novice designers to compare the effectiveness of the two training approaches (between-subjects) at three levels of task complexity (within-subjects). Results indicate an interaction effect between treatment and task complexity. The rule-based approach was significantly better in the low-complexity and high-complexity cases; there was no statistical difference in the medium-complexity case. Designer performance fell significantly as complexity increased. Overall, although the rule-based approach was not significantly superior to the pattern-based approach in all instances, it outperformed the pattern-based approach at two out of three complexity levels. The primary contributions of the study are (1) the operationalization of the complexity construct to a degree not addressed in previous studies; (2) the development of a pattern-based instructional approach to database design; and (3) the finding that the effectiveness of a particular training approach may depend on the complexity of the task.

Relevance: 40.00%

Abstract:

The purpose of this study was threefold: first, to investigate variables associated with learning and with performance as measured by the National Council Licensure Examination for Registered Nurses (NCLEX-RN); second, to validate the predictive value of the Assessment Technologies Institute (ATI) achievement exit exam; and third, to provide a model that could be used to predict performance on the NCLEX-RN, with implications for admission and curriculum development. The study was based on school learning theory, which implies that acquisition in school learning is a function of aptitude (pre-admission measures), opportunity to learn, and quality of instruction (program measures). Data were utilized from 298 graduates of an associate degree nursing program in the Southeastern United States. Of the 298 graduates, 142 were Hispanic, 87 Black non-Hispanic, 54 White non-Hispanic, and 15 reported as other. The graduates took the NCLEX-RN for the first time during the years 2003–2005. This study was a predictive, correlational design that relied upon retrospective data. Point-biserial correlations and chi-square analyses were used to investigate relationships between 19 selected predictor variables and the dichotomous criterion variable, NCLEX-RN result. The correlation and chi-square findings indicated that men did better on the NCLEX-RN than women; Blacks had the highest failure rates, followed by Hispanics; older students were more likely to pass the exam than younger students; and students who passed the exam started and completed the nursing program with a higher grade point average than those who failed the exam. Using logistic regression, five statistical models that used variables associated with learning and student performance on the NCLEX-RN were tested with a model adapted from Bloom's (1976) and Carroll's (1963) school learning theories. The derived model was: NCLEX-RN success = f(Nurse Entrance Test score, advanced medical-surgical nursing course grade achieved). The model demonstrates that student performance on the NCLEX-RN can be predicted by one pre-admission measure and one program measure. The Assessment Technologies Institute achievement exit exam (an outcome measure) had no predictive value for student performance on the NCLEX-RN. The model developed accurately predicted 94% of students' successful performance on the NCLEX-RN.
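
The form of the final predictive model above (NCLEX-RN success as a function of one pre-admission measure and one program measure) can be sketched as a logistic regression in Python. The data below are synthetic and the variable names and coefficients are hypothetical; the sketch illustrates the modeling approach only, not the study's data or reported accuracy.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 298
df = pd.DataFrame({
    "net_score": rng.normal(70, 10, n),          # Nurse Entrance Test score
    "med_surg_grade": rng.normal(3.0, 0.5, n),   # advanced med-surg course grade
})
# Synthetic pass/fail outcome generated from a hypothetical logistic relationship
logit_p = -12 + 0.1 * df["net_score"] + 1.5 * df["med_surg_grade"]
df["nclex_pass"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

exog = sm.add_constant(df[["net_score", "med_surg_grade"]])
model = sm.Logit(df["nclex_pass"], exog).fit()
print(model.summary())
print("classification accuracy:",
      ((model.predict() > 0.5) == df["nclex_pass"]).mean())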

Relevance: 40.00%

Abstract:

Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make these results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications also need to handle large volumes of unexpected events, often modified on the fly, containing conflicting information, and dealing with rapidly changing contexts, while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge by establishing an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of the event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and is hence better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events; this belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed, and wireless technologies in contemporary and future computer networks.
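
To make the belief-fusion idea above concrete, the sketch below combines the belief masses reported by two network entities about a composite event using Dempster's rule of combination. This is only an illustrative stand-in, under the assumption that a Dempster-Shafer-style combination approximates the consensus step; the dissertation's actual scheme is not reproduced, and all names and numbers are hypothetical.

from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dicts mapping frozenset -> mass)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    # Normalize by the non-conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two network entities report belief masses about a composite event E
# (frame of discernment: {E occurred, E did not occur}); values are hypothetical.
E, notE = frozenset({"E"}), frozenset({"notE"})
theta = E | notE                     # "unknown" mass assigned to the whole frame
node_a = {E: 0.7, theta: 0.3}
node_b = {E: 0.6, notE: 0.1, theta: 0.3}
print(dempster_combine(node_a, node_b))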

Relevance: 40.00%

Abstract:

With the progress of computer technology, computers are expected to be more intelligent in their interaction with humans, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may encounter difficulties in perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to the graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the ocular aberration of the user's eye. A complete and systematic modeling approach for describing the retinal image formation of the computer user was presented, taking advantage of modeling tools such as Zernike polynomials, the wavefront aberration, the Point Spread Function, and the Modulation Transfer Function. The ocular aberration of the computer user was first measured by a wavefront aberrometer, as a reference for the precompensation model. The dynamic precompensation was generated based on the resized aberration, with the pupil diameter monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation, using aberration data from a real human subject. An "artificial eye" experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of the image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use. The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method by showing a significant improvement in recognition accuracy. The merit and necessity of the dynamic precompensation were also substantiated by comparing it with static precompensation, and its visual benefit was further confirmed by the subjective assessments collected from the evaluation participants.
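
The modeling chain described above (wavefront aberration, pupil function, PSF, and frequency-domain precompensation) can be sketched in a few lines of Python. The sketch below uses a single Zernike defocus term and a Wiener-style inverse filter purely as an illustration; the grid size, pupil scaling, defocus coefficient, and regularization constant are hypothetical, and the dissertation's actual precompensation algorithm is not reproduced.

import numpy as np

N = 256                      # grid size (samples across the pupil plane)
pupil_radius = 0.45          # pupil radius as a fraction of the grid half-width
wavelength_um = 0.55         # design wavelength in micrometres (hypothetical)

# Normalized pupil coordinates
y, x = np.mgrid[-1:1:N*1j, -1:1:N*1j]
rho = np.hypot(x, y) / pupil_radius
pupil = (rho <= 1.0).astype(float)

# Wavefront aberration W (in micrometres): a Zernike defocus term Z(2,0) as an example
defocus_um = 0.5             # hypothetical defocus coefficient
W = defocus_um * np.sqrt(3.0) * (2.0 * np.clip(rho, 0, 1)**2 - 1.0) * pupil

# Generalized pupil function and incoherent PSF of the aberrated eye
P = pupil * np.exp(1j * 2.0 * np.pi * W / wavelength_um)
psf = np.abs(np.fft.fftshift(np.fft.fft2(P)))**2
psf /= psf.sum()

# Optical transfer function (OTF) and a Wiener-style precompensation filter that
# boosts the frequencies the eye attenuates; k limits contrast amplification.
otf = np.fft.fft2(np.fft.ifftshift(psf))
k = 1e-2
precomp_filter = np.conj(otf) / (np.abs(otf)**2 + k)

def precompensate(image):
    """Apply the precompensation filter; the retinal image (precompensated image
    blurred by the eye's PSF) then approximates the original target."""
    F = np.fft.fft2(image)
    out = np.real(np.fft.ifft2(F * precomp_filter))
    return np.clip(out, 0.0, 1.0)   # the display's dynamic range is limited

# Example: precompensate a simple test target before displaying it
target = np.zeros((N, N)); target[96:160, 96:160] = 1.0
display_image = precompensate(target)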

Relevance: 40.00%

Abstract:

Hardware/software (HW/SW) co-simulation integrates software simulation and hardware simulation simultaneously. Usually, an HW/SW co-simulation platform is used to ease debugging and verification in very large-scale integration (VLSI) design. To accelerate the computation of a gesture recognition technique, an HW/SW implementation using field programmable gate array (FPGA) technology is presented in this paper. The major contributions of this work are: (1) a novel design of a memory controller in the Verilog Hardware Description Language (Verilog HDL) to reduce memory consumption and the load on the processor; (2) the testing part of the neural network algorithm is hardwired to improve speed and performance. American Sign Language gesture recognition is chosen to verify the performance of the approach, and several experiments were carried out on four databases of gestures (alphabet signs A to Z). (3) The major benefit of this design is that it takes only a few milliseconds to recognize a hand gesture, which makes it computationally more efficient.

Relevance: 40.00%

Abstract:

Using multiple regression analysis, lodging managers' annual mean salaries in 143 Metropolitan Statistical Areas (MSAs) within the U.S. were analyzed to identify what relationships existed with variables related to general MSA characteristics and to the lodging industry's size and performance. By examining the relationships among these variables, the authors assess the long-term possibility of predicting lodging industry managers' salaries. Such predictions may have an impact on the financial performance of an individual lodging property or organization. In this paper, the concept is applied and explored within U.S. MSAs. The findings may have value for a variety of stakeholders, including human resources practitioners, the hospitality education community, and individuals considering lodging management careers.

Relevance: 40.00%

Abstract:

Peripheral nerves have demonstrated the ability to bridge gaps of up to 6 mm. Peripheral nervous system injury sites beyond this range require autograft or allograft surgery. Central nervous system cells do not allow spontaneous regeneration due to intrinsic environmental inhibition. Although stem cell therapy seems to be a promising approach toward nerve repair, it is essential to use the distinct three-dimensional architecture of a cell scaffold with proper biomolecule embedding in order to ensure that the local environment can be controlled well enough for growth and survival. Many approaches have been developed for the fabrication of 3D scaffolds, and more recently, fiber-based scaffolds produced via electrospinning have been garnering increasing interest, as the technique offers the opportunity for control over fiber composition as well as fiber mesh porosity using a relatively simple experimental setup. These attributes make electrospun fibers a new class of promising scaffolds for neural tissue engineering. Therefore, the purpose of this doctoral study is to investigate the use of the novel material PGD and its derivative PGDF for obtaining fiber scaffolds by electrospinning. The performance of these scaffolds, combined with neural lineage cells derived from ESCs, was evaluated by dissolvability tests, Raman spectroscopy, cell viability assays, real-time PCR, immunocytochemistry, extracellular electrophysiology, etc. The newly designed collector makes it possible to easily obtain fibers of adequate length and integrity. The use of solvents such as ethanol and water for electrospinning of the fibrous scaffolds provides a potentially less toxic and more biocompatible fabrication method. Cell viability testing demonstrated that the addition of gelatin leads to a significant improvement of cell proliferation on the scaffolds. Both real-time PCR and immunocytochemistry analyses indicated that motor neuron differentiation was achieved, as shown by high motor neuron gene expression, using the metabolites approach. The addition of fumaric acid into the fiber scaffolds further promoted the differentiation. Based on these results, the newly fabricated electrospun fiber scaffold, combined with neural lineage cells, provides a potential alternative strategy for nerve injury repair.

Relevance: 40.00%

Abstract:

This thesis describes the development of an adaptive control algorithm for Computerized Numerical Control (CNC) machines, implemented on a multi-axis motion control board based on the TMS320C31 DSP chip. The adaptive process involves two stages: plant modeling and inverse control application. The first stage builds a non-recursive model of the CNC system (the plant) using the Least-Mean-Square (LMS) algorithm. The second stage consists of defining a recursive structure (the controller) that implements an inverse model of the plant by using the coefficients of the model in an algorithm called Forward-Time Calculation (FTC). In this way, when the inverse controller is placed in series with the plant, it pre-compensates for the modification that the original plant introduces into the input signal. The performance of this solution was verified at three different levels: software simulation, implementation on a set of isolated motor-encoder pairs, and implementation on a real CNC machine. The use of the adaptive inverse controller effectively improved the step response of the system at all three levels. In the simulation, an ideal response was obtained. In the motor-encoder test, the rise time was reduced by as much as 80%, without overshoot, in some cases. Even with the larger mass of the actual CNC machine, a decrease in rise time and elimination of overshoot were obtained in most cases. These results lead to the conclusion that the adaptive inverse controller is a viable approach to position control in CNC machinery.
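
The first stage described above, building a non-recursive (FIR) model of the plant with the LMS algorithm, can be illustrated with the short Python sketch below. The plant stand-in, step size, and filter length are hypothetical, and the thesis's Forward-Time Calculation controller is not reproduced; only the LMS identification step is shown.

import numpy as np

rng = np.random.default_rng(0)

def plant(u):
    """Hypothetical stand-in for the CNC axis dynamics (a simple IIR low-pass)."""
    y = np.zeros_like(u)
    for n in range(1, len(u)):
        y[n] = 0.9 * y[n - 1] + 0.1 * u[n]
    return y

M = 32                      # FIR model length (taps)
mu = 0.01                   # LMS step size
w = np.zeros(M)             # adaptive FIR coefficients

u = rng.standard_normal(20000)   # persistently exciting input signal
d = plant(u)                     # desired signal = measured plant output

x = np.zeros(M)                  # tap-delay line
for n in range(len(u)):
    x = np.roll(x, 1); x[0] = u[n]
    y_hat = w @ x                # FIR model output
    e = d[n] - y_hat             # modeling error
    w += mu * e * x              # LMS coefficient update

# After convergence, w approximates the plant's impulse response; the inverse
# controller would then be built from these coefficients in the second stage.
print("first taps of identified model:", np.round(w[:5], 3))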

Relevance: 40.00%

Abstract:

The applicability of carbon-based foams as insulating or active-cooling materials in thermal protection systems (TPSs) of space vehicles is considered using computer modeling. This study focuses on a numerical investigation of the performance of carbon foams for use in the TPSs of space vehicles. Two kinds of carbon foam are considered: for active cooling, a carbon foam with a thermal conductivity of 100 W/m-K is used, and for insulation, a carbon foam with a thermal conductivity of 0.225 W/m-K is used. A 3D geometry is employed to simulate coolant flow and heat transfer through the carbon foam model. Gambit was used to model the 3D geometry, and the numerical simulation was carried out in FLUENT. Numerical results from this thesis suggest that the use of CFOAM and HTC carbon foams in TPSs may effectively protect the aluminum structure of the space shuttle during reentry of the space vehicle.
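
A greatly simplified 1-D transient-conduction sketch in Python is given below to illustrate how the two thermal conductivities quoted above drive the two roles of the foam (high-k for carrying heat away, low-k for insulating the structure). It uses an explicit finite-difference scheme; the material properties other than k, the slab thickness, boundary temperatures, and exposure time are hypothetical, and the thesis itself used a 3-D FLUENT model rather than this 1-D analogue.

import numpy as np

def simulate(k, rho=500.0, cp=1000.0, thickness=0.05, t_hot=1200.0,
             t_init=300.0, t_end=60.0, n=101):
    """Slab with a fixed hot surface on one side and an insulated back face."""
    alpha = k / (rho * cp)                 # thermal diffusivity
    dx = thickness / (n - 1)
    dt = 0.4 * dx**2 / alpha               # stable explicit time step (r = 0.4)
    T = np.full(n, t_init)
    T[0] = t_hot                           # heated front surface (e.g., reentry)
    for _ in range(int(t_end / dt)):
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = T[-2]                      # insulated back face
    return T[-1]                           # back-face temperature after t_end

for k in (0.225, 100.0):                   # W/m-K, the two foams considered
    print(f"k = {k:7.3f} W/m-K -> back-face T after 60 s: {simulate(k):.0f} K")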