14 results for Experimental design

in Digital Commons at Florida International University


Relevance:

60.00%

Publisher:

Abstract:

The unprecedented increase in the number of older adults is expected to increase the burden of osteoporosis on the individual and society. Blacks have been understudied in osteoporosis prevention education research; although the risk of osteoporosis is low in this population, its consequences are significant. This study employed a two-group experimental design (experimental and wait-list control groups) to evaluate the effect of an osteoporosis education program on two osteoporosis prevention behaviors (OPBs), calcium intake (CI) and physical activity (PA), in a group of community-dwelling Black older adults aged 50 years and older residing in South Florida. A final sample of 110 participants (mean age 70.15 years; 90% female, 10% male) completed a battery of questionnaires at two assessment periods. The experimental group participated in six weekly education sessions immediately following the baseline assessment, and the wait-list control group received the education after all participants had completed the end-of-program assessment. The weekly educational sessions were conducted in familiar social settings (a church or senior center) and employed constructs of the Revised Health Belief Model. The sessions focused on improving CI and PA, osteoporosis knowledge (OKT), self-efficacy (SE), and health beliefs (HB). Findings revealed a significantly greater increase in reported CI (M = 556 mg; Wilks' λ = .47, F(1, 108) = 122.97, p < .001, η² = .53), OKT (p < .001), and SE (p < .001) among participants in the experimental group compared to the wait-list control group. There was no significant difference between the two groups for PA and most of the HB subscales. OKT and SE were the best predictors of CI, while perceived barriers were the predominant factor predicting PA. Over the study period, change in SE was the only variable related to changes in both OPBs. The attrition rate was lower than expected, which can be attributed to the settings utilized for the study; these findings support the importance of using a familiar social setting. The results suggest that a program offered in multiple short sessions can improve OKT and SE in this underserved minority population, resulting in a change in OPBs (an increase in CI). However, there is a need to explore alternative strategies to improve PA in this population.
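As a quick consistency check on the reported statistics (this derivation is standard and not part of the original abstract), Wilks' Λ, F, and η² are related as follows for a test with one hypothesis degree of freedom and 108 error degrees of freedom:

```latex
% For a multivariate test with 1 hypothesis df and 108 error df:
F(1,108) = \frac{1-\Lambda}{\Lambda}\times 108,
\qquad
\eta^{2} = 1-\Lambda = \frac{F}{F+108}
% With the reported F = 122.97: eta^2 = 122.97/230.97 = .53 and
% Lambda = 108/230.97 = .47, matching the values quoted above.
```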

Relevance:

30.00%

Publisher:

Abstract:

Parameter design is an experimental design and analysis methodology for developing robust processes and products, where robustness implies insensitivity to noise disturbances. Subtle experimental realities, such as the joint effect of process knowledge and analysis methodology, may affect the effectiveness of parameter design in precision engineering, where the objective is to detect minute variation in product and process performance. In this thesis, approaches to statistical forced-noise design and analysis methodologies were investigated with respect to their ability to detect performance variations. Given a low degree of process knowledge, Taguchi's signal-to-noise ratio analysis was found to be more suitable for detecting minute performance variations than the classical approach based on polynomial decomposition. Comparison of the inner-array noise (IAN) and outer-array noise (OAN) structuring approaches showed that OAN is the more efficient design for precision engineering.
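For readers unfamiliar with the signal-to-noise (S/N) analysis referred to above, the sketch below shows the textbook Taguchi S/N ratios (smaller-the-better and nominal-the-best). It is illustrative only; the data and settings are invented and this is not the specific analysis used in the thesis.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi S/N ratio (dB) when the ideal response is zero (e.g., a deviation)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def sn_nominal_the_best(y):
    """Taguchi S/N ratio (dB) when the response should hit a nominal target;
    larger values indicate less relative variation (more robustness)."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical replicate measurements from two runs of a forced-noise (outer-array) design
run_a = [10.02, 10.05, 9.98, 10.01]
run_b = [10.30, 9.70, 10.25, 9.75]
print(sn_nominal_the_best(run_a))  # higher S/N: run A is the more robust setting
print(sn_nominal_the_best(run_b))
```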

Relevance:

30.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers, and it is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed, but they have several shortcomings: few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships; no empirical study has experimentally tested the effectiveness of any of these KB tools; and the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design that addresses these shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation, system restrictiveness and decisional guidance, were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him or her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge-implementation strategy, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving one task without using the system (pre-treatment task) and another task using one of the three systems (experimental task). The experimental task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results were: (1) the knowledge-based approach to database design support led to more accurate solutions than the Control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system to be easier to use than the Guidance system.

Relevance:

30.00%

Publisher:

Abstract:

Vapor-phase carbon adsorption systems are used to remove aromatics, aliphatics, and halogenated hydrocarbons. The adsorption capacity of granular activated carbon is reduced when environmental parameters (temperature, pressure, and humidity) interfere with homogeneous surface diffusion and pore distribution dynamics. The purpose of this study was to investigate the effects of parametric uncertainties on adsorption efficiency. Modified versions of the Langmuir isotherm, in conjunction with thermodynamic equations, described gaseous adsorption of a single-component influent onto microporous media. Experimental test results derived from Wang et al. (1999) simulated adsorption kinetics, while the Myers and Monson Langmuir constant accounted for isothermal gas compression and energetic heterogeneity under thermodynamic equilibrium conditions. The responsiveness of adsorption capacity to environmental uncertainties was analyzed by statistical sensitivity analysis and modeled with breakthrough curves. Results indicated that extensive fluctuations in adsorption capacity significantly reduced carbon consumption, while isothermal variations had a pronounced effect on saturation capacity.
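For context, the modified isotherms build on the classical single-component Langmuir form; the study's specific modifications and thermodynamic corrections are not reproduced here.

```latex
% Classical Langmuir isotherm for a single adsorbate at partial pressure P
q = q_{\max}\,\frac{K_L P}{1 + K_L P}
% q     : equilibrium adsorbed-phase loading on the activated carbon
% q_max : monolayer (saturation) capacity
% K_L   : Langmuir affinity constant (temperature dependent)
```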

Relevance:

30.00%

Publisher:

Abstract:

With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. Attacks fall into four main categories: Denial of Service (DoS), Probe, User to Root (U2R), and Remote to Local (R2L) attacks. Within these categories, DoS and Probe attacks show up with high frequency over a short period of time when they strike a system; they differ from normal traffic data and can easily be separated from normal activities. In contrast, U2R and R2L attacks are embedded in the data portions of the packets and normally involve only a single connection, which makes it difficult to achieve satisfactory detection accuracy for these two attack types. We therefore focus on the ambiguity problem between normal activities and U2R/R2L attacks, with the goal of building a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to improve the speed of detection: features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant; such features are removed so that only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature-selecting intrusion detectors and a data-mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to resolve the ambiguity problem; the latter applies data-mining techniques to automatically extract computer users' normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature-selecting and data-mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
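A minimal sketch of a correlation-based filter in the spirit described above (drop features that predict the label poorly or that are inter-correlated with an already selected feature). The thresholds, synthetic data, and function names are invented for illustration; this is not the dissertation's actual algorithm.

```python
import numpy as np

def correlation_filter(X, y, relevance_min=0.1, redundancy_max=0.9):
    """Keep features that correlate with the class label but not with an
    already-selected feature (a simple correlation-based filter)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # relevance: |correlation| between each feature and the label
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    selected = []
    for j in np.argsort(-relevance):           # most relevant first
        if relevance[j] < relevance_min:        # poor predictor -> drop
            continue
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_max
                        for k in selected)
        if not redundant:                       # inter-correlated -> drop
            selected.append(j)
    return selected

# Example with synthetic "traffic features"
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=500)   # redundant copy of feature 0
y = (X[:, 0] + X[:, 2] > 0).astype(float)
print(correlation_filter(X, y))                   # expect to keep 0 and 2, drop 1
```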

Relevance:

30.00%

Publisher:

Abstract:

We evaluated how changes in nutrient supply altered the composition of epiphytic and benthic microalgal communities in a Thalassia testudinum (turtle grass) bed in Florida Bay. We established study plots at four sites in the bay and added nitrogen (N) and phosphorus (P) to the sediments in a factorial design. After 18, 24, and 30 months of fertilization we measured the pigment concentrations in the epiphytic and benthic microalgal assemblages using high performance liquid chromatography. Overall, the epiphytic assemblage was P-limited in the eastern portion of the bay, but each phototrophic group displayed unique spatial and temporal responses to N and P addition. Epiphytic chlorophyll a, an indicator of total microalgal load, and epiphytic fucoxanthin, an indicator of diatoms, increased in response to P addition at one eastern bay site, decreased at another eastern bay site, and were not affected by P or N addition at two western bay sites. Epiphytic zeaxanthin, an indicator of the cyanobacteria/coralline red algae complex, and epiphytic chlorophyll b, an indicator of green algae, generally increased in response to P addition at both eastern bay sites but did not respond to P or N addition in the western bay. Benthic chlorophyll a, chlorophyll b, fucoxanthin, and zeaxanthin showed complex responses to N and P addition in the eastern bay, suggesting that the benthic assemblage is limited by both N and P. Benthic assemblages in the western bay were variable over time and displayed few responses to N or P addition. The contrasting nutrient limitation patterns between the epiphytic and benthic communities in the eastern bay suggest that altering nutrient input to the bay, as might occur during Everglades restoration, can shift microalgal community structure, which may subsequently alter food web support for upper trophic levels.

Relevance:

30.00%

Publisher:

Abstract:

Hurricanes are among the most destructive and costly natural hazards to the built environment, and their impact on low-rise buildings in particular is beyond acceptable. The major objective of this research was to perform a parametric evaluation of internal pressure (IP) for wind-resistant design of low-rise buildings and for wind-driven natural ventilation applications. For this purpose, a multi-scale experimental approach, i.e., full-scale testing at the Wall of Wind (WoW) and small-scale testing in a Boundary Layer Wind Tunnel (BLWT), combined with Computational Fluid Dynamics (CFD), was adopted. This provided a new capability to assess wind pressures realistically on internal volumes ranging from the small spaces formed between roof tiles and the roof deck, to attics, to room partitions. The effects on the IP of sudden breaching, of existing dominant openings in the building envelope, and of compartmentalization of the building interior were systematically investigated. Results indicated: (i) for sudden breaching of dominant openings, the transient overshooting response was lower than the subsequent steady-state peak IP, and internal volume correction was necessary for low-wind-speed testing facilities; for example, a building model without volume correction experienced a response four times faster and exhibited 30–40% lower mean and peak IP; (ii) for existing openings, vent openings distributed uniformly along the roof alleviated the IP, whereas one-sided openings aggravated it; (iii) larger dominant openings produced higher IP on the building envelope, and an off-center opening on the wall exhibited 30–40% higher IP than center-located openings; (iv) compartmentalization amplified the intensity of the IP; and (v) significant underneath pressure was measured for field tiles, warranting its consideration during net pressure evaluations. The part of the study aimed at wind-driven natural ventilation indicated: (i) the IP due to cross ventilation was 1.5 to 2.5 times higher for Ainlet/Aoutlet > 1 than for Ainlet/Aoutlet < 1, which in effect reduced the mixing of air inside the building and hence the ventilation effectiveness; (ii) the presence of multi-room partitioning increased the pressure differential and consequently the air exchange rate. Overall, good agreement was found between the observed large-scale, small-scale, and CFD-based IP responses. Comparisons with ASCE 7-10 consistently demonstrated that the code underestimates peak positive and suction IP.
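The internal volume correction mentioned in finding (i) is commonly obtained by matching the reduced Helmholtz frequency of the building cavity between model and full scale. Under the usual assumptions (same speed of sound, opening area scaling as L² and effective air-slug length as L), this gives the volume distortion sketched below; this is a standard scaling argument, not a result specific to this dissertation.

```latex
% Helmholtz frequency of a cavity with a single dominant opening
% (a = speed of sound, A = opening area, l_e = effective slug length, V = volume)
f_H = \frac{a}{2\pi}\sqrt{\frac{A}{V\, l_e}}
% Matching the reduced frequency f_H L / U between model (m) and full scale (f),
% with A \propto L^2 and l_e \propto L, requires
\frac{V_m}{V_f} = \left(\frac{L_m}{L_f}\right)^{3}\left(\frac{U_f}{U_m}\right)^{2}
% i.e., a facility running below the full-scale-consistent speed must enlarge the
% model's internal volume beyond the geometric (L_m/L_f)^3 scaling.
```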

Relevance:

30.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeatedly executed functions; the performance of an SoC system can therefore be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform, and two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications such as performance, energy consumption, and resource costs are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
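As a back-of-the-envelope illustration of why profiling for large hotspots precedes hardware mapping, Amdahl's law bounds the overall speedup obtainable by accelerating only the hotspot. This is a generic illustration, not a result from the dissertation; the 70% hotspot fraction is made up.

```python
def amdahl_speedup(hotspot_fraction, accel_factor):
    """Overall speedup when only `hotspot_fraction` of run time is accelerated
    by `accel_factor` (Amdahl's law)."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_factor)

# Hypothetical profile: the identified hotspot accounts for 70% of CODEC run time.
for accel in (5, 20, 100):
    print(f"{accel:>4}x accelerator -> {amdahl_speedup(0.70, accel):.2f}x overall")
# Even an infinitely fast accelerator cannot exceed 1 / (1 - 0.70) ~= 3.3x here,
# which is why profiling to find large hotspots comes before hardware mapping.
```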

Relevance:

30.00%

Publisher:

Abstract:

The design, construction, and optimization of a low-power, high-temperature heated ceramic sensor to detect leakage of halogen gases in refrigeration systems are presented. The manufacturing process used microelectronic assembly and the Low Temperature Cofire Ceramic (LTCC) technique. Four basic sensor materials were fabricated and tested: Li2SiO3, Na2SiO3, K2SiO3, and CaSiO3. The sensor material, sensor size, operating temperature, bias voltage, electrode size, firing temperature, gas flow, and sensor life were evaluated. All sensors responded to the gas, showing stability and reproducibility. Before exposing the sensor to the gas, the sensor was modeled as resistors in series, and the calculated values were in agreement with the experimental values. The sensor response to the gas was divided into surface diffusion and bulk diffusion; both were analyzed, showing agreement between the calculated and experimental values. The sensor with 51.5% CaSiO3 + 48.5% Li2SiO3 showed the best results, including a stable current and response to the gas.
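In its simplest textbook form, the series-resistance model referred to above sums the resistances of the conduction path (contacts plus ceramic sensing layer); the decomposition below is a generic sketch, not the dissertation's exact equivalent circuit.

```latex
% Generic series-resistance model of the sensor stack
R_{\mathrm{total}} = \sum_i R_i = \sum_i \frac{\rho_i\, l_i}{A_i}
% rho_i : resistivity of element i (strongly temperature dependent for the silicate ceramics)
% l_i   : length of the current path through it,  A_i : its cross-sectional area
```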

Relevance:

30.00%

Publisher:

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and on the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances, the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with Partial Turbulence Simulation. In this dissertation, the test procedure and scaling requirements for tests with partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory, the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale-equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well dealt with in many existing building codes and standards. Large-scale experiments, including wind blow-off tests and pressure measurements, were carried out to investigate the wind loading on concrete pavers. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift; the guidelines are formatted so that use can be made of existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
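A small numerical sketch of the spectrum-splitting idea described above: the variance above a cutoff frequency is what a size-limited tunnel can reproduce, while the low-frequency remainder must be accounted for quasi-statically in post-processing. The von Kármán spectral form and all parameter values are illustrative assumptions, not those used in the dissertation.

```python
import numpy as np

def von_karman_su(f, sigma_u2, L_u, U):
    """von Karman longitudinal velocity spectrum S_u(f)."""
    n = f * L_u / U                                   # reduced frequency
    return sigma_u2 * 4.0 * (L_u / U) / (1.0 + 70.8 * n ** 2) ** (5.0 / 6.0)

def integrate(y, x):
    """Trapezoidal integration without relying on deprecated helpers."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Assumed full-scale flow: 20 m/s mean speed, 100 m integral scale, 20% turbulence
# intensity; f_cut is the lowest full-scale-equivalent frequency the tunnel reproduces.
U, L_u, sigma_u2, f_cut = 20.0, 100.0, (0.2 * 20.0) ** 2, 0.02

f = np.logspace(-4, 2, 20000)                          # Hz
S = von_karman_su(f, sigma_u2, L_u, U)
low = integrate(S[f < f_cut], f[f < f_cut])            # variance treated quasi-statically
high = integrate(S[f >= f_cut], f[f >= f_cut])         # variance simulated in the tunnel
print(f"share of turbulence variance below f_cut: {low / (low + high):.1%}")
```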

Relevance:

30.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers, and it is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, they have several shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Secondly, there does not appear to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him or her to follow a specific design path. The Guidance system, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved one task without using a system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance, or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the Control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared with the Control system. The subjects perceived the Restrictive system to be easier to use than the Guidance system.
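A toy sketch of the distinction between the two knowledge-implementation strategies described above: a restrictive system removes options that violate a design rule, whereas a guidance system leaves every option available but attaches context-specific advice. The design step, rule, and data structures are invented for illustration and are not CODA's actual knowledge base.

```python
# Hypothetical design step: choosing the kind of a relationship between entities.
OPTIONS = ["1:1", "1:N", "M:N", "ternary"]

def rule_violations(option, context):
    """Invented heuristic: warn when a novice picks an M:N or higher-order
    relationship without having identified any attributes on the relationship."""
    if option in ("M:N", "ternary") and not context.get("relationship_attributes"):
        return ["M:N and higher-order relationships usually carry their own attributes; "
                "check whether an intermediate entity is needed."]
    return []

def restrictive_choices(context):
    # Restrictive strategy: options that break a rule are simply not offered.
    return [o for o in OPTIONS if not rule_violations(o, context)]

def guided_choices(context):
    # Guidance strategy: every option stays available, each with advice attached.
    return {o: rule_violations(o, context) or ["no issues detected"] for o in OPTIONS}

ctx = {"relationship_attributes": []}
print(restrictive_choices(ctx))   # ['1:1', '1:N']
print(guided_choices(ctx))        # all four options, two carrying a warning
```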