15 results for Verification and validation technology
in Digital Commons at Florida International University
Abstract:
Distance learning is growing and transforming educational institutions. The increasing use of distance learning by higher education institutions, particularly community colleges, coupled with higher student attrition in online courses than in traditional classrooms, suggests that increased attention should be paid to factors that affect online student course completion. The purpose of the study was to develop and validate an instrument to predict community college online student course completion based on faculty perceptions, yielding a prediction model of online course completion rates. Social Presence and Media Richness theories were used to develop a theoretically driven measure of online course completion. This research study involved surveying 311 community college faculty who had taught at least one online course in the past 2 years. Email addresses of participating faculty were provided by two south Florida community colleges. Each participant was contacted through email and given a link to an Internet survey. The survey response rate was 63% (192 of 303 available questionnaires). Data were analyzed through factor analysis, alpha reliability, and multiple regression. The exploratory factor analysis, using principal component analysis with varimax rotation, yielded a four-factor solution that accounted for 48.8% of the variance. Consistent with Social Presence theory, the factors, with their percent of variance in parentheses, were: immediacy (21.2%), technological immediacy (11.0%), online communication and interactivity (10.3%), and intimacy (6.3%). Internal consistency of the four factors was calculated using Cronbach's (1951) alpha, with reliability coefficients ranging between .680 and .828. Multiple regression analysis yielded a model that significantly predicted 11% of the variance in the dependent variable, the percentage of students who completed the online course.
As indicated in the literature (Johnson & Keil, 2002; Newberry, 2002), Media Richness theory appears to be closely related to Social Presence theory. However, elements from this theory did not emerge in the factor analysis.
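The internal-consistency step reported above (Cronbach's alpha for each factor) is straightforward to compute from an item-score matrix. A minimal numpy sketch follows; the responses are synthetic stand-ins, not the study's survey data, and the 4-item scale is only illustrative.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert (1-5) responses for a 4-item subscale from 192 respondents:
# a shared trait signal per respondent plus per-item noise.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(192, 1))
noise = rng.integers(-1, 2, size=(192, 4))
scores = np.clip(base + noise, 1, 5)

alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

Because the synthetic items share a common signal, alpha comes out high; real survey subscales, as in the abstract, typically land lower (here, .680 to .828).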
Abstract:
The purpose of this descriptive study was to evaluate the banking and insurance technology curriculum at ten junior colleges in Taiwan. The study focused on curriculum, curriculum materials, instruction, support services, student achievement, and job performance. Data were collected from a diverse sample of faculty, students, alumni, and employers. Questionnaires on the evaluation of curriculum at technical junior colleges were developed for use in this specific case. Data were collected from the sample described above and analyzed utilizing ANOVA, t-tests, and crosstabulations. Findings are presented which indicate that there is room for improvement in meeting individual students' needs. Using Stufflebeam's CIPP model for curriculum evaluation, it was determined that the curriculum was adequate in terms of the knowledge and skills imparted to students. However, students were dissatisfied with the rigidity of the curriculum and the lack of opportunity to satisfy their individual needs. Employers were satisfied with both the academic preparation of students and their on-the-job performance. In sum, the two-year banking and insurance technology programs of junior colleges in Taiwan were shown to have served adequately in preparing a workforce to enter business. It is now time to look toward the future and adapt the curriculum and instruction to the needs of an ever-evolving high-tech society.
Abstract:
Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received a lot of attention in recent years, with several methods, techniques, and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined which includes the evaluation of test cases based on Petri net testing theory, to be used in the testing process at the design level.
Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
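The core of model-checking a Petri-net behavioral model, as described above, is exhaustive exploration of the reachable markings against a property. The sketch below is a toy stand-in (a two-process mutual-exclusion net checked by breadth-first search), not the SAM tool or Spin itself, to illustrate the idea of verifying a safety property over every reachable state.

```python
from collections import deque

# Toy mutual-exclusion Petri net: a marking maps places to token counts;
# each transition consumes tokens from input places and produces to outputs.
transitions = [
    ({"idle1": 1, "lock": 1}, {"crit1": 1}),  # process 1 enters critical section
    ({"crit1": 1}, {"idle1": 1, "lock": 1}),  # process 1 leaves
    ({"idle2": 1, "lock": 1}, {"crit2": 1}),  # process 2 enters critical section
    ({"crit2": 1}, {"idle2": 1, "lock": 1}),  # process 2 leaves
]
initial = {"idle1": 1, "idle2": 1, "lock": 1, "crit1": 0, "crit2": 0}

def fire(marking, pre, post):
    """Return the successor marking, or None if the transition is not enabled."""
    if any(marking[p] < n for p, n in pre.items()):
        return None
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial):
    """Breadth-first enumeration of every reachable marking."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for pre, post in transitions:
            nxt = fire(m, pre, post)
            if nxt is not None and (key := tuple(sorted(nxt.items()))) not in seen:
                seen.add(key)
                queue.append(nxt)
        yield m

# Safety property: no reachable marking has both processes in the critical section.
violations = [m for m in reachable(initial) if m["crit1"] and m["crit2"]]
print(len(violations))  # → 0
```

A real model checker adds temporal-logic properties and state compression on top of this enumeration, but the verification question is the same: does any reachable state violate the property?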
Abstract:
This dissertation evaluated the feasibility of using commercially available immortalized cell lines in building a tissue-engineered in vitro blood-brain barrier (BBB) co-culture model for preliminary drug development studies. A mouse endothelial cell line and rat astrocyte cell lines purchased from American Type Culture Collections (ATCC) were the building blocks of the co-culture model. An astrocyte-derived acellular extracellular matrix (aECM) was introduced in the co-culture model to provide a novel in vitro biomimetic basement membrane for the endothelial cells to form endothelial tight junctions. Trans-endothelial electrical resistance (TEER) and solute mass transport studies were employed to quantitatively evaluate tight junction formation on the in vitro BBB models. Immunofluorescence microscopy and Western blot analysis were used to qualitatively verify the in vitro expression of occludin, one of the earliest discovered tight junction proteins. Experimental data from a total of 12 experiments conclusively showed that the novel BBB in vitro co-culture model with the astrocyte-derived aECM (CO+aECM) was promising in terms of establishing tight junction formation as represented by TEER values, transport profiles, and tight junction protein expression when compared with traditional co-culture (CO) model setups and endothelial cells cultured alone. Experimental data were also found to be comparable with several existing in vitro BBB models built from various methods. An in vitro colorimetric sulforhodamine B (SRB) assay revealed that the co-cultured samples with aECM resulted in less cell loss on the basal sides of the insert membranes than traditional co-culture samples. The novel tissue engineering approach using immortalized cell lines with the addition of aECM was proven to be a relevant alternative to traditional BBB in vitro modeling.
Abstract:
Automated information system design and implementation is one of the fastest changing aspects of the hospitality industry. During the past several years nothing has increased the professionalism or improved the productivity within the industry more than the application of computer technology. Intuitive software applications, deemed the first step toward making computers more people-literate, object-oriented programming, intended to more accurately model reality, and wireless communications are expected to play a significant role in future technological advancement.
Abstract:
BACKGROUND: The Pro Children Eating Habits Questionnaire has been evaluated as a valid and reliable tool in Europe to measure determinants of fruit and vegetable intake for children; however, it has not been validated for United States populations. The purpose of this study was to (1) assess the reliability and discriminant validity of fruit and vegetable correlates for the Pro Children Eating Habits Questionnaire; (2) investigate the predictive validity of determinants of fruit and vegetable consumption for multi-ethnic elementary school children; and (3) assess the association of social determinants with fruit and vegetable consumption. METHODS: One hundred thirty elementary school students from the 3rd and 5th grades completed this cross-sectional study. RESULTS: Fruit and vegetable determinants had satisfactory internal consistencies. No differences were found between the test and the retest for the individual questions, with the exception of the question on mean perceived vegetable intake. The discriminant validity indicated the questionnaire could show differences across grade and gender levels for barriers to eating fruit and vegetables but not for other factors. Grade together with gender explained barriers to eating fruit and vegetables. Greater availability of fruit in the home and school was associated with higher frequency of consumption. CONCLUSIONS: The results of this study indicate the Pro Children Eating Habits Questionnaire may be a reliable and valid tool for assessing fruit and vegetable consumption of children in the United States.
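The test-retest comparison described above amounts to correlating scores from two administrations of the same item: a high correlation (and no systematic shift) suggests stable responses. A minimal sketch with synthetic data, not the study's actual responses, follows.

```python
import numpy as np

# Hypothetical 1-5 Likert responses from 130 students, administered twice.
rng = np.random.default_rng(1)
test = rng.integers(1, 6, size=130).astype(float)             # first administration
retest = np.clip(test + rng.normal(0, 0.5, size=130), 1, 5)   # second, with some noise

# Test-retest reliability as the Pearson correlation between administrations.
r = np.corrcoef(test, retest)[0, 1]
print(round(r, 2))
```

In practice a paired test on the means (as the abstract's "no differences between test and retest" suggests) complements the correlation, since r alone cannot detect a uniform shift in responses.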
Abstract:
This thesis extended previous research on critical decision making and problem solving by refining and validating a measure designed to assess the use of critical thinking and critical discussion in sociomoral dilemmas. The purpose of this thesis was twofold: 1) to refine the administration of the Critical Thinking Subscale of the CDP to elicit more adequate responses and to refine the coding and scoring procedures for the total measure, and 2) to collect preliminary data on the initial reliabilities of the measure. Subjects consisted of 40 undergraduate students at Florida International University. Results indicate that the use of longer probes on the Critical Thinking Subscale was more effective in eliciting the adequate responses necessary for coding and evaluating the subjects' performance. Analyses of the psychometric properties of the measure consisted of test-retest reliability and inter-rater reliability.
Abstract:
Tropical Rainfall Measuring Mission (TRMM) rainfall retrieval algorithms are evaluated in tropical cyclones (TCs). Differences between the Precipitation Radar (PR) and TRMM Microwave Imager (TMI) retrievals are found to be related to the storm region (inner core vs. rainbands) and the convective nature of the precipitation as measured by radar reflectivity and ice scattering signature. In landfalling TCs, the algorithms perform differently depending on whether the rainfall is located over ocean, land, or coastal surfaces. Various statistical techniques are applied to quantify these differences and identify the discrepancies in rainfall detection and intensity. Ground validation is accomplished by comparing the landfalling storms over the Southeast US to the NEXRAD Multisensor Precipitation Estimates (MPE) Stage-IV product. Numerous recommendations are given to algorithm users and developers for applying and interpreting these algorithms in areas of heavy and widespread tropical rainfall such as tropical cyclones.
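Quantifying differences between two rainfall retrievals against a reference, as in the ground-validation step above, typically comes down to a few point-by-point statistics (bias, RMSE, correlation). The sketch below uses synthetic rain-rate arrays as stand-ins; no TRMM, PR/TMI, or MPE data are reproduced.

```python
import numpy as np

# Synthetic "ground truth" rain rates (mm/h) and a retrieval with a slight
# low bias plus random error, standing in for e.g. a satellite algorithm
# compared against a ground-based multisensor estimate.
rng = np.random.default_rng(2)
reference = rng.gamma(shape=2.0, scale=5.0, size=500)
retrieval = reference * 0.9 + rng.normal(0, 2.0, 500)

bias = np.mean(retrieval - reference)                 # mean error (sign matters)
rmse = np.sqrt(np.mean((retrieval - reference) ** 2)) # typical error magnitude
corr = np.corrcoef(reference, retrieval)[0, 1]        # pattern agreement
print(f"bias={bias:.2f} mm/h  rmse={rmse:.2f} mm/h  r={corr:.2f}")
```

Stratifying these statistics by surface type (ocean/land/coast) or storm region (inner core vs. rainbands), as the study does, is just a matter of computing them over the corresponding subsets of points.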
Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability, and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection, and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing, and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes and abnormal conditions, along with stability and security information. These developments provide valuable measurements for technical power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation, and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide area applications. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to heal the system.
A design for a DC microgrid was developed to integrate it with the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, enabling study of such an architecture in system operation to help remedy abnormal system conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices were measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage the power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
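At its core, a DAQ-based phasor measurement routine like the one described above estimates the amplitude and phase of the fundamental from one cycle of samples, which a single-bin DFT does exactly for a clean sinusoid. The parameters below (60 Hz nominal, 32 samples per cycle, a 120 V signal at 30°) are illustrative, not taken from the test-bed.

```python
import numpy as np

f0 = 60.0                    # nominal grid frequency (Hz)
fs = 1920.0                  # sampling rate: 32 samples per cycle
n = int(fs / f0)             # samples in exactly one cycle
t = np.arange(n) / fs

# Synthetic single-phase voltage waveform with known amplitude and phase.
amplitude, phase = 120.0, np.deg2rad(30.0)
x = amplitude * np.cos(2 * np.pi * f0 * t + phase)

# One-bin DFT at f0 over one full cycle recovers the complex phasor
# (peak-amplitude convention: |phasor| = amplitude, angle = phase).
phasor = (2.0 / n) * np.sum(x * np.exp(-1j * 2 * np.pi * f0 * t))
print(round(abs(phasor), 1), round(np.degrees(np.angle(phasor)), 1))  # → 120.0 30.0
```

Real phasor measurement units add windowing, frequency tracking, and time synchronization (e.g. GPS) on top of this estimate so that phasors from different points in the grid are comparable.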
Abstract:
Three new technologies have been brought together to develop a miniaturized radiation monitoring system. The research involved (1) investigation of a new HgI₂ detector, (2) VHDL modeling, (3) FPGA implementation, and (4) in-circuit verification. The packages used included an EG&G crystal (HgI₂) manufactured at zero gravity, Viewlogic's VHDL and synthesis tools, Xilinx's technology library, its FPGA implementation tool, and a high-density device (XC4003A). The results show (1) reduced cycle time between design and hardware implementation; (2) unlimited re-design and implementation using static RAM technology; (3) customer-based design, verification, and system construction; and (4) suitability for intelligent systems. These advantages surpassed conventional chip-design technologies and methods in ease of use, cycle time, and price for medium-sized VLSI applications. It is also expected that the density of these devices will improve radically in the near future.
Abstract:
Security remains a top priority for organizations as their information systems continue to be plagued by security breaches. This dissertation developed a unique approach to assess the security risks associated with information systems based on a dynamic neural network architecture. The risks that are considered encompass the production computing environment and the client machine environment. The risks are established as metrics that define how susceptible each of the computing environments is to security breaches. The merit of the approach developed in this dissertation is based on the design and implementation of Artificial Neural Networks to assess the risks in the computing and client machine environments. The datasets that were utilized in the implementation and validation of the model were obtained from business organizations using a web survey tool hosted by Microsoft. This site was designed as a host site for anonymous surveys that were devised specifically as part of this dissertation. Microsoft customers can log in to the website and submit their responses to the questionnaire. This work asserted that security in information systems is not dependent exclusively on technology but rather on the triumvirate of people, process, and technology. The questionnaire, and consequently the developed neural network architecture, accounted for all three key factors that impact information systems security. As part of the study, a methodology on how to develop, train, and validate such a predictive model was devised and successfully deployed. This methodology prescribed how to determine the optimal topology, activation function, and associated parameters for this security-based scenario. The assessment of the effects of security breaches on information systems has traditionally been post-mortem, whereas this dissertation provided a predictive solution where organizations can determine how susceptible their environments are to security breaches in a proactive way.
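To make the training-and-validation methodology above concrete, here is a minimal feedforward network that scores a binary "risk" label from three survey-style inputs (standing in for people/process/technology factors). The architecture, data, and labels are all synthetic illustrations; the dissertation's actual topology and datasets are not reproduced.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: 200 "organizations" with three factor scores in [0, 1];
# label an environment high-risk when the mean factor score exceeds 0.5.
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(200, 3))
y = (X.mean(axis=1) > 0.5).astype(float)

# One hidden layer of 5 units, trained by full-batch gradient descent
# on the binary cross-entropy loss.
W1, b1 = rng.normal(0, 0.5, (3, 5)), np.zeros(5)
W2, b2 = rng.normal(0, 0.5, (5, 1)), np.zeros(1)
lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)                 # hidden activations
    out = sigmoid(h @ W2 + b2).ravel()       # predicted risk score in (0, 1)
    d_out = (out - y)[:, None]               # BCE gradient at the output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)       # backprop through the hidden layer
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

accuracy = ((out > 0.5) == (y > 0.5)).mean()
print(round(accuracy, 2))
```

A methodology like the one described would then vary the topology and activation function and compare validation performance, rather than fixing them a priori as this sketch does.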
Abstract:
Even though the popularity and usage of teleconferencing is evident primarily outside the lodging industry, lodging operators cannot choose to ignore the role teleconferencing will play in meeting the changing needs of guests. The authors discuss the factors that spurred the growth of teleconferencing, the opportunities and threats faced by lodging operators, and suggestions for taking advantage of the technology.
Abstract:
In outsourcing relationships with China, the Electronic Manufacturing (EM) and Information Technology Services (ITS) industry in Taiwan may possess such advantages as the continuing growth of its production value, complete manufacturing supply chain, low production cost and a large-scale Chinese market, and language and culture similarity compared to outsourcing to other countries. Nevertheless, the Council for Economic Planning and Development of Executive Yuan (CEPD) found that Taiwan's IT services outsourcing to China is subject to certain constraints and might not be as successful as the EM outsourcing (Aggarwal, 2003; CEPD, 2004a; CIER, 2003; Einhorn and Kriplani, 2003; Kumar and Zhu, 2006; Li and Gao, 2003; MIC, 2006). Some studies examined this issue, but failed to (1) provide statistical evidence about lower prevalence rates of IT services outsourcing, and (2) clearly explain the lower prevalence rates of IT services outsourcing by identifying similarities and differences between both types of outsourcing contexts. This research seeks to fill that gap and possibly provide potential strategic guidelines to ITS firms in Taiwan. This study adopts Transaction Cost Economics (TCE) as the theoretical basis. The basic premise is that different types of outsourcing activities may incur differing transaction costs and realize varying degrees of outsourcing success due to differential attributes of the transactions in the outsourcing process. 
Using primary data gathered from questionnaire surveys of ninety-two firms, the results from exploratory analysis and binary logistic regression indicated that (1) when outsourcing to China, Taiwanese firms' ITS outsourcing tends to have higher levels of asset specificity, uncertainty, and technical skills relative to EM outsourcing, and these features indirectly reduce firms' outsourcing prevalence rates via their direct positive impacts on transaction costs; (2) Taiwanese firms' ITS outsourcing tends to have a lower level of transaction structurability relative to EM outsourcing, and this feature indirectly increases firms' outsourcing prevalence rates via its direct negative impact on transaction costs; (3) frequency does influence firms' transaction costs in ITS outsourcing positively, but does not affect their outsourcing prevalence rates; (4) relatedness does influence firms' transaction costs positively and prevalence rates negatively in ITS outsourcing, but its impacts on the prevalence rates are not caused by the mediation effects of transaction costs; and (5) firm size of the outsourcing provider does not affect firms' transaction costs, but does affect their outsourcing prevalence rates in ITS outsourcing directly and positively. Using primary data gathered from face-to-face interviews of executives from seven firms, the results from inductive analysis indicated that (1) IT services outsourcing has lower prevalence rates than EM outsourcing, and (2) this result is mainly attributed to Taiwan's core competence in manufacturing and management and the higher overall transaction costs of IT services outsourcing. Specifically, there is not much difference between the two types of outsourcing context in the transaction characteristic of reputation and most aspects of the overall comparison. Although there are some differences in the firm size of the outsourcing provider, the difference does not cause apparent impacts on firms' overall transaction costs.
The medium or above-medium differences in the transaction characteristics of asset specificity, uncertainty, frequency, technical skills, transaction structurability, and relatedness have caused higher overall transaction costs for IT services outsourcing. These higher costs might cause lower prevalence rates for ITS outsourcing relative to EM outsourcing. Overall, the interview results are consistent with the statistical analyses and support my expectation that, in outsourcing to China, Taiwan's electronic manufacturing firms do have lower prevalence rates of IT services outsourcing relative to EM outsourcing due to higher transaction costs caused by certain attributes. To solve this problem, firms' management should aim at identifying alternative strategies and strive to reduce the overall transaction costs of IT services outsourcing by initiating appropriate strategies which fit their environment and needs.
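The binary logistic regression step above relates transaction-cost attributes to a yes/no outsourcing outcome. A minimal numpy sketch follows; the features, effect sizes, and data are synthetic illustrations (keeping only the study's sample size of 92 firms), not the survey results.

```python
import numpy as np

# Synthetic standardized transaction-cost attributes for 92 firms.
rng = np.random.default_rng(4)
n = 92
asset_spec = rng.normal(0, 1, n)        # asset specificity
uncertainty = rng.normal(0, 1, n)       # uncertainty
structurability = rng.normal(0, 1, n)   # transaction structurability
X = np.column_stack([np.ones(n), asset_spec, uncertainty, structurability])

# Generate outcomes where high specificity/uncertainty lower the odds of
# outsourcing and high structurability raises them (illustrative signs only).
logit = 0.3 - 1.0 * asset_spec - 0.8 * uncertainty + 0.9 * structurability
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit by gradient ascent on the logistic log-likelihood.
beta = np.zeros(4)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - p) / n

print(np.round(beta, 2))  # fitted coefficients; signs mirror the generating model
```

With real survey data one would also report standard errors and fit mediation models, since the study's claims hinge on whether attribute effects on prevalence operate through transaction costs.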