14 results for Number system

in Digital Commons at Florida International University


Relevance:

30.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed; however, they have many shortcomings. Few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. There has been no empirical study that experimentally tested the effectiveness of any of these KB tools. The problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design that addresses the above shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation, system restrictiveness and decisional guidance, were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge implementation strategy, restrictive or guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task). The experimental-task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results are: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system to be easier to use than the Guidance system.

Relevance:

30.00%

Publisher:

Abstract:

Computers have dramatically changed the way we live, conduct business, and deliver education. They have infiltrated the Bahamian public school system to the extent that many educators now feel the need for a national plan. The development of such a plan is a challenging undertaking, especially in developing countries where physical, financial, and human resources are scarce. This study assessed the situation with regard to computers within the Bahamian public school system and provided recommended guidelines to the Bahamian government based on the results of a survey, the body of knowledge about trends in computer usage in schools, and the country's needs. This was a descriptive study for which an extensive review of the literature in the areas of computer hardware, software, teacher training, research, curriculum, support services, and local context variables was undertaken. One objective of the study was to establish what should or could be done relative to the state of the art in educational computing. A survey was conducted involving 201 teachers and 51 school administrators from 60 randomly selected Bahamian public schools, using a random stratified cluster sampling technique. This study used both quantitative and qualitative research methodologies. Quantitative methods were used to summarize the data about numbers and types of computers, categories of software available, peripheral equipment, and related topics through forced-choice questions in a survey instrument; the results were displayed in tables and charts. Qualitative methods, data synthesis and content analysis, were used to analyze the non-numeric data obtained from open-ended questions on the teachers' and school administrators' questionnaires, such as those regarding teachers' perceptions and attitudes about computers and their use in classrooms. Interpretative methodologies were also used to analyze the qualitative results of several interviews conducted with senior public school system officials, and content analysis was used to gather data from the literature on topics pertaining to the study. Based on the literature review and the data gathered for this study, a number of recommendations are presented. These recommendations may be used by the government of the Commonwealth of The Bahamas to establish policies with regard to the use of computers within the public school system.

Relevance:

30.00%

Publisher:

Abstract:

Run-off-road (ROR) crashes have increasingly become a serious concern for transportation officials in the State of Florida. These types of crashes have increased proportionally in recent years statewide and have been a focus of the Florida Department of Transportation. The goal of this research was to develop statistical models that can be used to investigate possible causal relationships between roadway geometric features and ROR crashes on Florida's rural and urban principal arterials. In this research, Zero-Inflated Poisson (ZIP) and Zero-Inflated Negative Binomial (ZINB) regression models were used to better model the excessive number of roadway segments with no ROR crashes. Because Florida covers a diverse area comprising sixty-seven counties, the state was divided into four geographical regions to minimize possible unobserved heterogeneity. Three years of crash data (2000–2002) for principal arterials on the Florida State Highway System were used. Several statistical models based on the ZIP and ZINB regression methods were fitted to predict the expected number of ROR crashes on urban and rural roads in each region. Each region was further divided into urban and rural areas, resulting in a total of eight crash models. A best-fit predictive model was identified for each of the eight cases in terms of AIC values. The ZINB regression was found to be appropriate for seven of the eight models, and the ZIP regression was found to be more appropriate for the remaining one. To achieve model convergence, some explanatory variables that were not statistically significant were included; therefore, strong conclusions cannot be drawn from some of these models. Given the complex nature of crashes, recommendations for additional research are made. Investigating the interaction of weather and human condition would be valuable in discerning additional causal relationships for these types of crashes. Additionally, roadside data should be considered and incorporated into future research on ROR crashes.
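For illustration, the following is a minimal sketch (not the study's actual code) of how segment-level ROR crash counts could be fit with ZIP and ZINB models and compared by AIC; the DataFrame `segments` and columns such as `ror_crashes`, `lane_width`, `shoulder_width`, and `aadt` are hypothetical.

```python
# Sketch: fit zero-inflated count models to segment-level crash data and
# compare them by AIC. Column names and data layout are assumptions.
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import (
    ZeroInflatedPoisson,
    ZeroInflatedNegativeBinomialP,
)

def compare_zip_zinb(segments: pd.DataFrame) -> dict:
    y = segments["ror_crashes"]                                   # observed ROR crash counts
    X = sm.add_constant(segments[["lane_width", "shoulder_width", "aadt"]])

    zip_fit = ZeroInflatedPoisson(y, X, exog_infl=X, inflation="logit").fit(disp=0)
    zinb_fit = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X, inflation="logit", p=2).fit(disp=0)

    # Lower AIC indicates the better-fitting model for this region/area type.
    best = min((zip_fit.aic, "ZIP"), (zinb_fit.aic, "ZINB"))
    return {"ZIP_AIC": zip_fit.aic, "ZINB_AIC": zinb_fit.aic, "best": best[1]}
```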

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research is to develop design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. The design considerations focus on improving three key areas: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These design considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and the environmental parameters affecting its formation (methylation) and breakdown (demethylation). The sampling methodology investigates a proof of concept for the monitoring of MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using Quartz Crystal Microbalance (QCM) sensors. This study focuses on the measurement of inorganic mercury (Hg) (e.g., Hg2+) and applies the lessons learned to organic Hg (e.g., MeHg) detection. Context awareness of a WSN and its sampling strategies is enhanced by using spatial analysis techniques, namely geostatistical analysis (i.e., classical variography and ordinary point kriging), to help predict the phenomena of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions about control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy, etc.). This methodology improves the precision of controllability by adding potentially significant information about unmonitored locations. Two types of sensors are investigated in this study for near-optimal placement in a WSN: (1) environmental (e.g., humidity, moisture, temperature, etc.) and (2) visual (e.g., camera) sensors. The near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis based on randomly chosen points representing the sensor locations; the spatial analysis employs geostatistical methods, and optimization is carried out with Monte Carlo analysis. Visual sensor placement is accomplished for omnidirectional cameras operating in a WSN using an optimal placement metric (OPM), which is calculated for each grid point based on line-of-sight (LOS) in a defined number of directions, taking known obstacles into consideration. Optimal areas for camera placement are determined from the areas generating the largest OPMs, and the statistics are examined using Monte Carlo analysis with a varying number of obstacles and cameras in a defined space.
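As a rough illustration of the visual-sensor placement step, the sketch below (hypothetical, not the author's implementation) computes a grid of OPM values for an omnidirectional camera by counting visible cells along a fixed number of directions until a known obstacle or the grid boundary is reached.

```python
# Sketch: line-of-sight-based optimal placement metric (OPM) on an occupancy grid.
import math
import numpy as np

def opm_grid(obstacles: np.ndarray, n_directions: int = 16, max_range: int = 50) -> np.ndarray:
    """obstacles: 2-D boolean array, True where a cell is blocked."""
    rows, cols = obstacles.shape
    opm = np.zeros((rows, cols))
    angles = [2 * math.pi * k / n_directions for k in range(n_directions)]

    for r in range(rows):
        for c in range(cols):
            if obstacles[r, c]:
                continue                                  # cameras cannot sit on obstacles
            visible = 0
            for a in angles:
                for step in range(1, max_range + 1):      # march outward along the ray
                    rr = int(round(r + step * math.sin(a)))
                    cc = int(round(c + step * math.cos(a)))
                    if not (0 <= rr < rows and 0 <= cc < cols) or obstacles[rr, cc]:
                        break
                    visible += 1
            opm[r, c] = visible
    return opm                                            # largest values suggest candidate camera sites
```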

Relevance:

30.00%

Publisher:

Abstract:

Rates of survival of victims of sudden cardiac arrest (SCA) using cardiopulmonary resuscitation (CPR) have shown little improvement over the past three decades. Since registered nurses (RNs) comprise the largest group of healthcare providers in U.S. hospitals, it is essential that they are competent in performing the four primary measures of CPR (compression, ventilation, medication administration, and defibrillation) in order to improve survival rates of SCA patients. The purpose of this experimental study was to test a color-coded SMOCK system on: (1) time to implement emergency patient care measures; (2) technical skills performance; (3) number of medical errors; and (4) team performance during simulated CPR exercises. The study sample was 260 RNs (M = 40 years, SD = 11.6) with work experience as an RN (M = 7.25 years, SD = 9.42). Nurses were allocated to a control or intervention arm consisting of 20 groups of 5-8 RNs per arm, for a total of 130 RNs in each arm. Nurses in each study arm were given clinical scenarios requiring emergency CPR. Nurses in the intervention group wore different color-labeled aprons (smocks) indicating their role assignment (medications, ventilation, compression, defibrillation, etc.) on the code team during CPR. Findings indicated that the intervention using color-labeled smocks for pre-assigned roles had a significant effect on the time nurses started compressions (t = 3.03, p = 0.005), ventilations (t = 2.86, p = 0.004), and defibrillations (t = 2.00, p = 0.05) when compared to the controls using the standard of care. In performing technical skills, nurses in the intervention groups performed compressions and ventilations significantly better than those in the control groups. The control groups made significantly more total errors (M = 7.55, SD = 1.54) than the intervention groups (M = 5.60, SD = 1.90), t = -2.61, p = 0.013. There were no significant differences in team performance measures between the groups. Study findings indicate that the use of color-labeled smocks during CPR emergencies resulted in shorter times to start emergency CPR, fewer errors, more technical skills completed successfully, and no differences in team performance.
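The group comparisons reported above are independent-samples t-tests; a minimal sketch of that analysis in Python is shown below, using made-up per-group times rather than the study's data.

```python
# Sketch: independent-samples t-test comparing intervention and control arms,
# e.g., time to start compressions. The values below are hypothetical.
import numpy as np
from scipy import stats

intervention_times = np.array([18.0, 22.5, 19.4, 25.1, 20.8])   # seconds (made-up values)
control_times = np.array([27.3, 31.0, 24.6, 35.2, 29.9])        # seconds (made-up values)

t_stat, p_value = stats.ttest_ind(intervention_times, control_times)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # p < .05 would suggest a group difference
```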

Relevance:

30.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so the process of improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools, and hardware acceleration can yield significant performance improvements for highly mathematical calculations or frequently repeated functions. The performance of SoC systems can then be improved if the hardware acceleration method is used to accelerate the elements that incur performance overheads. The concepts presented in this study can be easily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform, and two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) A system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
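The link between profiling a hotspot and the achievable system-level gain can be illustrated with Amdahl's law; the sketch below uses made-up numbers and is not drawn from the thesis's measurements.

```python
# Sketch: once profiling reports what fraction of total runtime a hotspot
# function consumes, Amdahl's law bounds the overall speedup obtainable by
# offloading only that function to an FPGA accelerator.
def amdahl_speedup(hotspot_fraction: float, accelerator_speedup: float) -> float:
    """Overall speedup when only `hotspot_fraction` of runtime is accelerated."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accelerator_speedup)

if __name__ == "__main__":
    # e.g., a kernel taking 80% of CODEC runtime, accelerated 20x in hardware
    # (hypothetical figures)
    print(f"overall speedup: {amdahl_speedup(0.80, 20.0):.2f}x")
```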

Relevance:

30.00%

Publisher:

Abstract:

Neural crest cells originate from the dorsal-most region of the embryonic neural tube. These cells migrate into several embryonic locations and differentiate into a variety of cell types. Cardiac neural crest (CNC) cells are a set of neural crest progenitors that aid in the proper formation of the cardiac septum, which separates the pulmonary from the systemic circulation. We have used Splotch mice to investigate whether murine CNC cells play a role during the development of the myocardium and the conduction system. Splotch mice carry a mutation in the PAX3 transcription factor and display defective CNC cell migration. A scanning electron microscopy analysis of Splotch mutant embryonic hearts reveals abnormalities in the interventricular septum. In addition, the right and left ventricular cavities appear dilated relative to a wild-type heart. Hoechst nuclear staining of Splotch heart cryosections demonstrates a decreased number of cardiomyocytes and a correspondingly thinner ventricular wall. The absence of Connexin 40 in the ventricles of Splotch mutants suggests conduction system defects. These results support the evidence that CNC cell signaling plays a role in modulating the growth and development of murine cardiomyocytes and their differentiation into conductile cells.

Relevance:

30.00%

Publisher:

Abstract:

Effective interaction with personal computers is a basic requirement for many of the functions that are performed in our daily lives. With the rapid emergence of the Internet and the World Wide Web, computers have become one of the premier means of communication in our society. Unfortunately, these advances have not become equally accessible to physically handicapped individuals. In reality, a significant number of individuals with severe motor disabilities, due to a variety of causes such as Spinal Cord Injury (SCI) and Amyotrophic Lateral Sclerosis (ALS), may not be able to use the computer mouse as a vital input device for computer interaction. The purpose of this research was to further develop and improve an existing alternative input device for computer cursor control to be used by individuals with severe motor disabilities. This thesis describes the development of, and the underlying principle for, a practical hands-off human-computer interface based on Electromyogram (EMG) signals and Eye Gaze Tracking (EGT) technology, compatible with the Microsoft Windows operating system (OS). Results of the software developed in this thesis show a significant improvement in the performance and usability of the EMG/EGT cursor-control HCI.
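A hypothetical sketch of the EMG/EGT control idea is shown below: gaze coordinates set the cursor position while a thresholded EMG envelope triggers the click. The helper callables `read_gaze`, `read_emg_window`, `move_cursor`, and `click` stand in for real tracker, amplifier, and OS APIs and are not part of the thesis.

```python
# Sketch: gaze drives coarse cursor position; an EMG burst acts as the mouse button.
import numpy as np

EMG_CLICK_THRESHOLD = 0.6      # normalized RMS amplitude, tuned per user (assumed value)

def emg_rms(window: np.ndarray) -> float:
    """Root-mean-square envelope of one EMG window."""
    return float(np.sqrt(np.mean(np.square(window))))

def update_cursor(read_gaze, read_emg_window, move_cursor, click) -> None:
    """One control-loop iteration: gaze sets position, EMG burst issues a click."""
    x, y = read_gaze()                          # screen coordinates from the eye tracker
    move_cursor(x, y)
    if emg_rms(read_emg_window()) > EMG_CLICK_THRESHOLD:
        click()                                 # muscle contraction acts as the click
```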

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to develop a GIS-based multi-class index overlay model to determine areas susceptible to inland flooding during extreme precipitation events in Broward County, Florida. Data layers used in the method include Airborne Laser Terrain Mapper (ALTM) elevation data, excess precipitation depth determined through a Soil Conservation Service (SCS) Curve Number (CN) analysis, and the slope of the terrain. The method includes a calibration procedure that uses "weights and scores" criteria obtained from records of Hurricane Irene (1999), a reported 100-year precipitation event, along with Doppler radar data and documented flooding locations. Results are displayed in maps of eastern Broward County depicting flooding scenarios for a 100-year, 24-hour storm under different soil saturation conditions. As expected, the results of the multi-class index overlay analysis showed that the potential for inland flooding increases when a higher antecedent moisture condition is experienced. The proposed method shows some potential as a predictive tool for flooding susceptibility based on a relatively simple approach.
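The excess-precipitation layer rests on the standard SCS Curve Number runoff relationship; a minimal sketch follows, with placeholder CN values and storm depth rather than the study's calibrated inputs.

```python
# Sketch: SCS Curve Number estimate of direct runoff (excess precipitation).
def scs_runoff_depth(precip_in: float, curve_number: float, ia_ratio: float = 0.2) -> float:
    """Direct runoff (inches) for a storm depth `precip_in` (inches) and a given CN."""
    s = 1000.0 / curve_number - 10.0          # potential maximum retention (in)
    ia = ia_ratio * s                         # initial abstraction (in)
    if precip_in <= ia:
        return 0.0
    return (precip_in - ia) ** 2 / (precip_in - ia + s)

# Higher antecedent moisture is commonly represented by a higher CN,
# which yields more excess precipitation for the same storm (placeholder values).
print(scs_runoff_depth(10.0, 70))   # drier condition
print(scs_runoff_depth(10.0, 85))   # wetter condition
```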

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to examine the factors behind the failure rates of Associate in Arts (AA) graduates from Miami-Dade Community College (M-DCC) transferring to the Florida State University System (SUS). In M-DCC's largest disciplines, the university failure rate was 13% for Business & Management, 13% for Computer Science, and 14% for Engineering. The hypotheses tested were: Hypothesis 1 (H1): The lower-division (LD) overall cumulative GPA and/or the LD major-field GPA for AA graduates are predictive of the SUS GPA for the Business & Management, Computer Science, and Engineering disciplines. Hypothesis 2 (H2): Demographic variables (age, race, gender) are predictive of performance at the university among M-DCC AA graduates in Engineering, Business & Management, and Computer Science. Hypothesis 3 (H3): Administrative variables (CLAST, College Level Academic Skills Test, subtests) are predictive of university performance (GPA) for the Business & Management, Engineering, and Computer Science disciplines. Hypothesis 4 (H4): LD curriculum variables (course credits, course quality points) are predictive of SUS performance for the Engineering, Business & Management, and Computer Science disciplines. Multiple regression was the inferential procedure selected for the predictions, and descriptive statistics were generated on the predictors. Results for H1 identified the LD GPA as the most significant variable in accounting for the variability of the university GPA for the Business & Management, Computer Science, and Engineering disciplines. For H2, no significant results were obtained for the age and gender variables, but the ethnic subgroups indicated significance at the .0001 level; however, differentials in GPA may not have been due directly to the race factor but, rather, to curriculum choices and performance outcomes while in the LD. The CLAST computation variable (H3) was a significant predictor of the SUS GPA, most likely due to the mathematics structure pervasive in these disciplines. For H4, two curriculum variables were significant in explaining the variability of the university GPA: the number of required critical major credits completed and the quality of the student's performance for these credits. Descriptive statistics on the predictors indicated that 78% of those failing in the State University System had an LD major GPA (calculated from the critical required university credits earned and the quality points of these credits) of less than 3.0, and 83% of those failing at the university had an overall community college GPA of less than 3.0.
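A minimal sketch of the multiple-regression step is shown below; the DataFrame layout and column names (`sus_gpa`, `ld_gpa`, `ld_major_gpa`, `clast_computation`) are assumptions for illustration, not the study's dataset.

```python
# Sketch: predicting university (SUS) GPA from lower-division predictors for
# one discipline's AA graduates, using ordinary least squares.
import pandas as pd
import statsmodels.formula.api as smf

def fit_sus_gpa_model(df: pd.DataFrame):
    """df columns assumed: sus_gpa, ld_gpa, ld_major_gpa, clast_computation."""
    model = smf.ols("sus_gpa ~ ld_gpa + ld_major_gpa + clast_computation", data=df)
    results = model.fit()
    return results.summary()     # coefficients show which predictors explain SUS GPA
```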

Relevance:

30.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, there are many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system approach, which is less restrictive, involves providing context-specific, informative, and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than the system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved a task without using the system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance, or Restrictive (experimental task). Analysis of the experimental-task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. It was also found that the subjects perceived the Restrictive system to be easier to use than the Guidance system.
