906 results for Experimental performance metrics


Relevance:

30.00%

Publisher:

Abstract:

This research is based on the premises that teams can be designed to optimize their performance and that appropriate team coordination is a significant factor in team outcome performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different team configurations. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, coordination mechanisms, and the job's structural characteristics. The TCM can be used to determine the team design characteristics that most likely lead the team to achieve optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using Java and the Cybele Pro agent architecture. The model implements the effect of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. These team-member agents use decision making and explicit and implicit mechanisms to coordinate the job. Model validation included comparing the TCM's results with statistics from a real team and with the results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The ANOVA results were used to recommend the combination of levels of the experimental factors that optimizes the completion time for a team that races sailboats. The main contribution of this research to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models. In a stochastic job structure, the tasks required to complete the job change during the team's execution of the job. This research proposed three new types of dependencies between tasks required to model a job as a stochastic structure: conditional sequential, single-conditional sequential, and merge dependencies.
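
As an illustration of the kind of screening analysis described above, the sketch below builds a 2^(6-1) half-fraction design (generator F = ABCDE, an assumption; the abstract does not state the generator) with six coded factors and runs a main-effects ANOVA on a simulated completion-time response. The factor names and response model are placeholders, not the TCM's actual factors or data.

```python
# Hypothetical sketch: 2^(6-1) half-fraction design and main-effects ANOVA screening.
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Full factorial in five base factors (coded -1/+1); the sixth is aliased with ABCDE.
base = list(itertools.product([-1, 1], repeat=5))
runs = [(a, b, c, d, e, a * b * c * d * e) for a, b, c, d, e in base]  # 32 runs
df = pd.DataFrame(runs, columns=list("ABCDEF"))

# Placeholder response: completion time with a few active effects plus noise.
df["time"] = 100 - 5 * df["A"] + 3 * df["C"] - 2 * df["A"] * df["C"] + rng.normal(0, 1, len(df))

# Main-effects ANOVA screens which design factors drive completion time.
model = smf.ols("time ~ A + B + C + D + E + F", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```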

Relevance:

30.00%

Publisher:

Abstract:

We evaluated how changes in nutrient supply altered the composition of epiphytic and benthic microalgal communities in a Thalassia testudinum (turtle grass) bed in Florida Bay. We established study plots at four sites in the bay and added nitrogen (N) and phosphorus (P) to the sediments in a factorial design. After 18, 24, and 30 months of fertilization we measured the pigment concentrations in the epiphytic and benthic microalgal assemblages using high performance liquid chromatography. Overall, the epiphytic assemblage was P-limited in the eastern portion of the bay, but each phototrophic group displayed unique spatial and temporal responses to N and P addition. Epiphytic chlorophyll a, an indicator of total microalgal load, and epiphytic fucoxanthin, an indicator of diatoms, increased in response to P addition at one eastern bay site, decreased at another eastern bay site, and were not affected by P or N addition at two western bay sites. Epiphytic zeaxanthin, an indicator of the cyanobacteria/coralline red algae complex, and epiphytic chlorophyll b, an indicator of green algae, generally increased in response to P addition at both eastern bay sites but did not respond to P or N addition in the western bay. Benthic chlorophyll a, chlorophyll b, fucoxanthin, and zeaxanthin showed complex responses to N and P addition in the eastern bay, suggesting that the benthic assemblage is limited by both N and P. Benthic assemblages in the western bay were variable over time and displayed few responses to N or P addition. The contrasting nutrient limitation patterns between the epiphytic and benthic communities in the eastern bay suggest that altering nutrient input to the bay, as might occur during Everglades restoration, can shift microalgal community structure, which may subsequently alter food web support for upper trophic levels.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the study was to investigate the physiological and psychological benefits provided by a self-selected health and wellness course to a racially and ethnically diverse student population. It was designed to determine whether students at a 2-year Hispanic-Serving Institution (HSI) in a large metropolitan area would enhance their capacity to perform physical activities, increase their knowledge of health topics, and raise their exercise self-efficacy after completing a course that included educational and activity components for a period of 16 weeks. A total of 185 students voluntarily agreed to participate in the study. An experimental group was selected from six sections of a health and wellness course, and a comparison group from students in a student life skills course. All participants completed anthropometric tests of physical fitness, a knowledge test, and an exercise self-efficacy scale at the beginning and at the conclusion of the semester. ANCOVA analyses, with the pretest scores as the covariate and the difference scores as the dependent variable, indicated a significant improvement of the experimental group over the comparison group in five of the seven anthropometric tests. In addition, the experimental group improved over the comparison group in two of the three sections of the exercise self-efficacy scale, indicating greater confidence to participate in physical activities in spite of barriers. The experimental group also increased in knowledge of health-related topics over the comparison group at the .05 significance level. Results indicated beneficial outcomes gained by students enrolled in a 16-week health and wellness course. The study has several implications for practitioners, faculty members, educational policy makers, and researchers in terms of implementing strategies to promote healthy behaviors in college students and encouraging them to engage in regular physical activity throughout their college years.
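
A minimal sketch of the ANCOVA described above, run on simulated placeholder scores: difference scores as the dependent variable, pretest as the covariate, and group membership as the factor. Column names and effect sizes are assumptions for illustration only.

```python
# Hypothetical ANCOVA sketch: group effect on gain scores, adjusting for pretest level.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 185
df = pd.DataFrame({
    "group": rng.choice(["experimental", "comparison"], size=n),
    "pretest": rng.normal(50, 10, n),
})
# Placeholder effect: the experimental group gains ~3 points more on average.
df["diff_score"] = rng.normal(2, 5, n) + np.where(df["group"] == "experimental", 3, 0)

# ANCOVA: test the group effect on the gain after adjusting for the pretest covariate.
model = smf.ols("diff_score ~ C(group) + pretest", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```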

Relevance:

30.00%

Publisher:

Abstract:

Experimental evidence suggests that derived relational responding (DRR) may provide a behavioral model of complex language phenomena. This study assigned 72 students to groups based on their performance on a complex relational task. Performance on the DRR task was found to relate to scores on the WAIS-III.

Relevance:

30.00%

Publisher:

Abstract:

Questions: How are the early survival and growth of seedlings of Everglades tree species, planted in an experimental setting on artificial tree islands, affected by hydrology and substrate type? What are the implications of these responses for broader tree island restoration efforts? Location: Loxahatchee Impoundment Landscape Assessment (LILA), Boynton Beach, Florida, USA. Methods: An experiment was designed to test hydrological and substrate effects on seedling growth and survivorship. Two islands – a peat and a limestone-core island, representing two major types found in the Everglades – were constructed in four macrocosms. A mixture of eight tree species was planted on each island in March of 2006 and 2007. Survival and height growth of seedlings planted in 2006 were assessed periodically during the next two and a half years. Results: Survival and growth improved with increasing elevation on both tree island substrate types. Seedlings' survival and growth responses along a moisture gradient matched species distributions along natural hydrological gradients in the Everglades. Substrate effects on seedling performance included higher survival for most species on the limestone tree islands and faster growth on their peat-based counterparts. Conclusions: The present results could have profound implications for restoration of forests on existing landforms and artificial creation of tree islands. Knowledge of species tolerance to flooding and responses to the different edaphic conditions present in wetlands is important in selecting suitable species to plant on restored tree islands.

Relevance:

30.00%

Publisher:

Abstract:

Context: Core strength training (CST) has been popular in the fitness industry for a decade. Although strong core muscles are believed to enhance athletic performance, only a few scientific studies have been conducted to identify the effectiveness of CST in improving athletic performance. Objective: Identify the effects of a 6-wk CST program on running kinetics, lower extremity stability, and running performance in recreational and competitive runners. Design and Setting: A test-retest, randomized controlled design was used to assess the effect of CST and no CST on ground reaction force (GRF), lower extremity stability scores, and running performance. Participants: Twenty-eight healthy adults (age, 36.9 ± 9.4 yrs; height, 168.4 ± 9.6 cm; mass, 70.1 ± 15.3 kg) were recruited and randomly divided into two groups. Main Outcome Measures: GRF was determined by calculating peak impact vertical GRF (vGRF), peak active vGRF, duration of the braking horizontal GRF (hGRF), and duration of the propulsive hGRF as measured while running across a force plate. Lower extremity stability in three directions (anterior, posterior, lateral) was assessed using the Star Excursion Balance Test (SEBT). Running performance was determined by a 5000 m run timed on selected outdoor tracks. Six 2 (time) × 2 (condition) mixed-design ANOVAs were used to determine whether CST influenced each dependent variable, p < .05. Results: No significant interactions were found for any kinetic variable or SEBT score, p > .05, but 5000 m run time showed a significant interaction, p < .05. SEBT scores improved in both groups, but more in the experimental group. Conclusion: CST did not significantly influence kinetic efficiency or lower extremity stability, but did influence running performance.
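
A sketch of one of the 2 (time) × 2 (condition) analyses, using the 5000 m run time as the dependent variable and a random-intercept mixed model as a stand-in for the mixed-design ANOVA; the interaction term plays the role of the reported time × condition interaction. All values are simulated placeholders, not the study's data.

```python
# Hypothetical mixed-model sketch for a 2 (time) x 2 (condition) design on 5000 m run time.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for s in range(28):
    group = "CST" if s < 14 else "control"
    base = rng.normal(1500, 120)            # pre-test 5000 m time in seconds
    gain = -60 if group == "CST" else -5    # placeholder training effect
    rows.append({"subject": s, "group": group, "time": "pre", "run_s": base})
    rows.append({"subject": s, "group": group, "time": "post",
                 "run_s": base + gain + rng.normal(0, 20)})
df = pd.DataFrame(rows)

# The time x group interaction corresponds to the ANOVA interaction reported above.
model = smf.mixedlm("run_s ~ time * group", df, groups=df["subject"]).fit()
print(model.summary())
```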

Relevance:

30.00%

Publisher:

Abstract:

This study was conducted to determine whether the use of the technology known as the Classroom Performance System (CPS), commonly referred to as "Clickers", improves the learning gains of students enrolled in a biology course for science majors. CPS is one of a group of developing technologies adapted for providing feedback in the classroom using a learner-centered approach. It supports and facilitates discussion among students and between them and teachers, and provides for participation by passive students. Advocates, influenced by constructivist theories, claim increased academic achievement. In science teaching, the results have been mixed, but there is some evidence of improvements in conceptual understanding. The study employed a pretest-posttest, non-equivalent groups experimental design. The sample consisted of 226 participants in six sections of a college biology course at a large community college in South Florida, with two instructors trained in the use of clickers. Each instructor randomly assigned their sections to CPS (treatment) and non-CPS (control) groups. All participants filled out a survey that included demographic data at the beginning of the semester. The treatment group used clicker questions throughout, with discussion as necessary, whereas the control groups answered the same questions as quizzes, similarly engaging in discussion where necessary. Learning gains were assessed on a pre/posttest basis. The average learning gains, defined as the actual gain divided by the possible gain, were slightly better in the treatment group than in the control group, but the difference was statistically non-significant. An analysis of covariance (ANCOVA) with pretest scores as the covariate was conducted to test for significant differences between the treatment and control groups on the posttest. A second ANCOVA was used to determine the significance of differences between the treatment and control groups on the posttest scores after controlling for sex, GPA, academic status, experience with clickers, and instructional style. The results indicated a small increase in learning gains, but it was not statistically significant. The data did not support an increase in learning based on the use of the CPS technology. This study adds to the body of research that questions whether CPS technology merits classroom adoption.
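
A small sketch of the learning-gain metric described above (actual gain divided by possible gain), computed on simulated placeholder scores; the maximum score, group labels, and score distributions are assumptions for illustration only.

```python
# Hypothetical sketch: normalized learning gain = (post - pre) / (max possible - pre).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
max_score = 100
df = pd.DataFrame({
    "group": rng.choice(["CPS", "non-CPS"], size=226),
    "pre": rng.uniform(30, 70, 226).round(),
})
df["post"] = np.clip(df["pre"] + rng.normal(12, 8, 226), 0, max_score).round()

# Normalized gain: the gain achieved divided by the gain still possible at pretest.
df["gain"] = (df["post"] - df["pre"]) / (max_score - df["pre"])
print(df.groupby("group")["gain"].mean())
```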

Relevance:

30.00%

Publisher:

Abstract:

Conceptual database design is an unusually difficult and error-prone task for novice designers. This study examined how two training approaches – rule-based and pattern-based – might improve performance on database design tasks. A rule-based approach prescribes a sequence of rules for modeling conceptual constructs and the actions to be taken at various stages while developing a conceptual model. A pattern-based approach presents data modeling structures that occur frequently in practice and prescribes guidelines on how to recognize and use these structures. This study describes the conceptual framework, experimental design, and results of a laboratory experiment that employed novice designers to compare the effectiveness of the two training approaches (between-subjects) at three levels of task complexity (within-subjects). Results indicate an interaction effect between treatment and task complexity. The rule-based approach was significantly better in the low-complexity and high-complexity cases; there was no statistical difference in the medium-complexity case. Designer performance fell significantly as complexity increased. Overall, though the rule-based approach was not significantly superior to the pattern-based approach in all instances, it outperformed the pattern-based approach at two of the three complexity levels. The primary contributions of the study are (1) the operationalization of the complexity construct to a degree not addressed in previous studies; (2) the development of a pattern-based instructional approach to database design; and (3) the finding that the effectiveness of a particular training approach may depend on the complexity of the task.

Relevance:

30.00%

Publisher:

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and of the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances, the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with Partial Turbulence Simulation. In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory, the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of the load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well covered by many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
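
To make the partial-turbulence idea concrete, the sketch below splits a standard von Kármán along-wind velocity spectrum at an assumed cutoff frequency and compares the variance a tunnel can reproduce above the cutoff with the low-frequency variance that must be handled quasi-steadily in post-test analysis. The wind speed, turbulence level, length scale, and cutoff are illustrative values, not the dissertation's test conditions.

```python
# Sketch: split a von Karman along-wind spectrum into low- and high-frequency variance.
import numpy as np
from scipy.integrate import quad

U = 20.0        # mean wind speed, m/s (assumed)
sigma_u = 3.0   # streamwise turbulence standard deviation, m/s (assumed)
L_u = 100.0     # integral length scale, m (assumed)

def S_u(f):
    """Von Karman longitudinal velocity spectrum, (m/s)^2 per Hz."""
    n = f * L_u / U
    return 4.0 * sigma_u**2 * (L_u / U) / (1.0 + 70.8 * n**2) ** (5.0 / 6.0)

f_cut = 0.1  # Hz: below this, the tunnel cannot reproduce the gusts (assumed cutoff)
var_low, _ = quad(S_u, 0.0, f_cut)
var_high, _ = quad(S_u, f_cut, np.inf)

print(f"total variance ~ {var_low + var_high:.2f} (sigma_u^2 = {sigma_u**2:.2f})")
print(f"missing low-frequency fraction: {var_low / sigma_u**2:.1%}")
```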

Relevance:

30.00%

Publisher:

Abstract:

As part of a multi-university research program funded by NSF, a comprehensive experimental and analytical study of the seismic behavior of hybrid fiber-reinforced polymer (FRP)-concrete columns is presented in this dissertation. The experimental investigation includes cyclic tests of six large-scale concrete-filled FRP tube (CFFT) and RC columns followed by monotonic flexural tests, a nondestructive evaluation of damage using ultrasonic pulse velocity between the two test sets, and tension tests of sixty-five FRP coupons. Two analytical models, using ANSYS and OpenSees, were developed and favorably verified against both the cyclic and monotonic flexural tests, and the results of the two methods were compared. A parametric study was also carried out to investigate the effect of three main parameters on primary seismic response measures. The responses of typical CFFT columns to three representative earthquake records were also investigated. The study shows that only specimens with carbon FRP cracked, whereas specimens with glass or hybrid FRP did not show any visible cracks throughout the cyclic tests. Further monotonic flexural tests showed that the carbon specimens experienced both flexural cracking in tension and crumpling in compression. Glass and hybrid specimens, on the other hand, all showed local buckling of the FRP tubes. Compared with conventional RC columns, CFFT columns possess higher flexural strength and energy dissipation with an extended plastic hinge region. Among all CFFT columns, the hybrid lay-up demonstrated the highest flexural strength and initial stiffness, mainly because of its high reinforcement index and FRP/concrete stiffness ratio, respectively. Moreover, at the same drift ratio, the hybrid lay-up was also the best in terms of energy dissipation. Specimens with glass-fiber tubes, on the other hand, exhibited the highest ductility due to the greater flexibility of glass FRP composites. Furthermore, the ductility of CFFTs showed a strong correlation with the rupture strain of the FRP. The parametric study further showed that different FRP architectures and rebar types may lead to different failure modes for CFFT columns. Transient analysis under strong ground motions showed that the column with an off-axis nonlinear filament-wound glass FRP tube exhibited seismic performance superior to all other CFFTs. Moreover, higher FRP reinforcement ratios may lead to a brittle system failure, while a well-engineered FRP reinforcement configuration may significantly enhance the seismic performance of CFFT columns.
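
A brief sketch of one response measure mentioned above: energy dissipation per loading cycle, taken as the enclosed area of a force-displacement hysteresis loop. The loop here is a synthetic ellipse, not data from the CFFT specimens.

```python
# Sketch: dissipated energy per cycle = area enclosed by a force-displacement loop.
import numpy as np

theta = np.linspace(0.0, 2.0 * np.pi, 1000)
disp = 0.05 * np.cos(theta)                 # displacement, m (synthetic loop)
force = 80.0e3 * np.cos(theta - 0.6)        # force, N; the phase lag opens the loop

# Enclosed area of the closed loop (shoelace formula) equals the dissipated energy.
energy = 0.5 * abs(np.sum(disp * np.roll(force, -1) - np.roll(disp, -1) * force))
print(f"energy dissipated per cycle ~ {energy / 1e3:.1f} kJ")
```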

Relevance:

30.00%

Publisher:

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans is still relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.

Relevance:

30.00%

Publisher:

Abstract:

The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation, and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four-bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is derived using the Lagrangian formulation, and a SIMULINK control model of the actuator is developed from this equation. In addition, a Bond Graph methodology is presented for the dynamic simulation; the Bond Graph model comprises individual component models of the actuator along with the control, and the required torque was simulated with it. Results indicate that high acceleration (around 20 g) can be achieved with a modest torque input (3 N·m or less). A practical prototype of the actuator was designed using SOLIDWORKS and then fabricated to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10 g at the midpoint of the travel as the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate standalone and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, and a quadrature encoder attached to the DC motor is used to control the end effector. The associated control scheme of the actuator is analyzed and integrated with the physical prototype. In standalone experiments, the end effector achieved around 17 g acceleration over a stroke from 0.2 m to 0.78 m, in good agreement with the developed dynamic model. Finally, a Design of Experiments (DOE) based statistical approach, with data collected from the Bond Graph model, is introduced to identify the parametric combination that yields the greatest performance; this approach helps in designing the actuator without much complexity.
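
A simplified kinematic sketch under assumed geometry: a single rhombus cell of side L driven by its half-angle, with the end effector moving along one diagonal (x = 2L sin θ). The real actuator networks several extended four-bar cells and includes full dynamics; this only shows how a prescribed motion profile maps to end-effector acceleration via numerical differentiation.

```python
# Sketch: end-effector displacement, velocity, and acceleration for an assumed rhombus cell.
import numpy as np

L = 0.5                                    # link length, m (assumed)
t = np.linspace(0.0, 0.5, 5000)            # time, s
theta = 0.2 + 1.2 * (1 - np.cos(2 * np.pi * t / 0.5)) / 2   # half-angle sweep, rad

x = 2.0 * L * np.sin(theta)                # displacement along the output diagonal, m
v = np.gradient(x, t)                      # velocity, m/s
a = np.gradient(v, t)                      # acceleration, m/s^2

print(f"stroke ~ {x.max() - x.min():.2f} m, peak acceleration ~ {abs(a).max() / 9.81:.1f} g")
```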

Relevance:

30.00%

Publisher:

Abstract:

This work presents an experimental investigation of the thermal-hydraulic performance of a nanofluid composed of graphene nanoparticles dispersed in a 70:30 mixture (by volume) of water and ethylene glycol. The tests were carried out under forced convection inside a circular tube with uniform heat flux on the wall, in the laminar-turbulent transition regime. The mass flow rate ranged from 40 to 70 g/s, corresponding to Reynolds numbers between 3000 and 7500. The heat flux was maintained constant at 11, 16, and 21 kW/m², and the inlet temperature at 15, 20, and 25 °C. Three samples were produced with nanofluid volumetric concentrations of 0.05%, 0.10%, and 0.15%. Thermophysical properties were experimentally measured for all samples and critically compared with the theoretical models most commonly used in the literature. Initially, experiments with distilled water confirmed the validity of the experimental apparatus for the thermal-hydraulic tests. The nanofluid samples with the highest thermal conductivity, corresponding to volumetric concentrations of 0.15% and 0.10%, were then subjected to the tests. The thermal-hydraulic performance of both samples was unsatisfactory: the convective heat transfer coefficients of the nanofluids decreased by 21% on average for the sample with φ = 0.15% and by 26% for φ = 0.10%, and the pressure drop of the samples was higher than that of the base fluid. Finally, the pressure drop and convective heat transfer coefficient of both samples were also compared with theoretical models. The models used for pressure drop showed excellent agreement with the experimental results, which is remarkable considering the transitional flow regime.
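
A back-of-the-envelope sketch of the two working quantities in such a test section: the Reynolds number obtained from the mass flow rate, Re = 4ṁ/(πDμ), and the local convective coefficient obtained from the imposed wall heat flux, h = q″/(T_wall − T_bulk). The tube diameter, viscosity, and temperatures are assumed values, not the rig's.

```python
# Sketch: Reynolds number and local convective coefficient for an assumed test section.
import math

m_dot = 0.055          # kg/s, mid-range of the 40-70 g/s tested
D = 0.010              # tube inner diameter, m (assumed)
mu = 1.5e-3            # dynamic viscosity of the water/EG mixture, Pa.s (assumed)

Re = 4.0 * m_dot / (math.pi * D * mu)
print(f"Re ~ {Re:.0f}")   # should fall within the 3000-7500 transition range reported

q_wall = 16e3                  # W/m^2, one of the tested heat fluxes
T_wall, T_bulk = 45.0, 30.0    # degC, illustrative local wall and bulk temperatures
h = q_wall / (T_wall - T_bulk)
print(f"local h ~ {h:.0f} W/m^2.K")
```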

Relevance:

30.00%

Publisher:

Abstract:

Wave measurement is of vital importance for assessing wave power resources and for developing wave energy devices, especially with regard to the energy production and survivability of the devices. Wave buoys are one of the most popular measurement technologies developed and used for long-term wave measurements. To determine whether wave characteristics can be recorded accurately by wave buoys, an experimental study was carried out on the performance of three wave buoy models, namely two WaveScan buoys and one ODAS buoy, in a wave tank using the European FP7 MARINET facilities. This paper presents the test results in both the time and frequency domains and the comparison between the wave buoy and wave gauge measurements. The analysis reveals that, for both regular and irregular waves, the WaveScan buoys perform better than the ODAS buoy in terms of accuracy, and the WaveScan buoy measurements correlate very well with those from the wave gauges.
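
A minimal sketch of the frequency-domain processing implied above: estimate the wave spectrum from a surface-elevation record, compute the significant wave height Hm0 = 4√m0, and correlate a buoy record against a gauge record. The signals here are synthetic (one regular wave component plus noise), not the MARINET tank data.

```python
# Sketch: spectral significant wave height and buoy-vs-gauge correlation on synthetic data.
import numpy as np
from scipy.signal import welch

fs = 10.0                                   # Hz sampling rate (assumed)
t = np.arange(0, 600, 1 / fs)               # 10-minute record
rng = np.random.default_rng(4)
eta_gauge = 0.05 * np.sin(2 * np.pi * 0.5 * t) + 0.01 * rng.standard_normal(t.size)
eta_buoy = eta_gauge + 0.005 * rng.standard_normal(t.size)   # buoy = gauge + sensor noise

f, S = welch(eta_buoy, fs=fs, nperseg=1024)  # spectral density estimate
m0 = np.trapz(S, f)                          # zeroth spectral moment
print(f"Hm0 ~ {4 * np.sqrt(m0):.3f} m")

r = np.corrcoef(eta_buoy, eta_gauge)[0, 1]
print(f"buoy-gauge correlation: {r:.3f}")
```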

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a responsibility of the community to optimize the radiation dose of CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics to characterize the radiation dose and image quality of a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models in clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software tool for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current condition. The study effectively modeled the anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distribution. The dependence of organ dose coefficients on patient size and scanner models was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date with representative age, weight percentile, and body mass index (BMI) range.

With effective quantification of organ dose under constant tube current conditions, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated from Monte Carlo simulations in which the TCM function is explicitly modeled.

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). In the first phase of the study we focused on body CT examinations, so the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method that was developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
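
A simplified sketch of step (1) above, under a common approximation that the thesis' technique goes beyond: if quantum noise variance scales inversely with dose, a reduced-dose image at dose fraction f can be emulated by adding zero-mean Gaussian noise with σ_add = σ_full·√(1/f − 1). This ignores the noise texture and reconstruction effects that a clinical image-based noise-insertion tool must handle.

```python
# Sketch: emulate a reduced-dose CT image by adding Gaussian noise (1/dose variance scaling).
import numpy as np

rng = np.random.default_rng(5)

def simulate_reduced_dose(image_hu: np.ndarray, sigma_full: float, dose_fraction: float) -> np.ndarray:
    """Return a synthetic lower-dose image from a full-dose CT image (values in HU)."""
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return image_hu + rng.normal(0.0, sigma_add, size=image_hu.shape)

# Toy example: a uniform 0 HU region with 10 HU noise at full dose, simulated at half dose.
full_dose = rng.normal(0.0, 10.0, size=(256, 256))
half_dose = simulate_reduced_dose(full_dose, sigma_full=10.0, dose_fraction=0.5)
print(f"noise: full ~ {full_dose.std():.1f} HU, simulated half-dose ~ {half_dose.std():.1f} HU")
```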

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.