12 results for Second Life (software)
in Digital Commons at Florida International University
Abstract:
The population of English Language Learners (ELLs) globally has been increasing substantially every year. In the United States alone, adult ELLs are the fastest-growing portion of learners in adult education programs (Yang, 2005). There is a significant need to improve the teaching of English to ELLs in the United States and other predominantly English-speaking countries. However, for many ELLs, speaking, especially to Native English Speakers (NESs), causes considerable language anxiety, which in turn plays a major role in hindering their language development and academic progress (Pichette, 2009; Woodrow, 2006). Task-based Language Teaching (TBLT), such as simulation activities, has long been viewed as an effective approach for second-language development. Current advances in technology and the rapid emergence of Multi-User Virtual Environments (MUVEs) have provided an opportunity for educators to consider conducting simulations online for ELLs to practice speaking English to NESs. Yet to date, empirical research on the effects of MUVEs on ELLs' language development and speaking is limited (Garcia-Ruiz, Edwards, & Aquino-Santos, 2007). This study used a true experimental treatment-control repeated measures design to compare the perceived speaking anxiety levels (as measured by an anxiety scale administered after each simulation activity) of 11 ELLs (5 in the control group, 6 in the experimental group) when speaking to NESs during 10 simulation activities. Simulations in the control group were done face-to-face, while those in the experimental group were done in the MUVE of Second Life. The results of the repeated measures ANOVA, after the Huynh-Feldt epsilon correction, demonstrated for both groups a significant decrease in anxiety levels over time, from the first simulation to the tenth and final simulation. When comparing the two groups, the results revealed a statistically significant difference, with the experimental group demonstrating a greater anxiety reduction. These results suggest that language instructors should consider including face-to-face and MUVE simulations with ELLs paired with NESs as part of their language instruction. Future investigations should explore the use of other multi-user virtual environments and/or measure other dimensions of the ELL/NES interactions.
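As a rough illustration of this analysis design (not the study's code or data), the sketch below fits a mixed between-within repeated measures ANOVA to synthetic anxiety scores in Python; the pingouin library, all variable names, and the generated numbers are assumptions introduced here.

```python
# Illustrative sketch only: synthetic data standing in for the study's
# per-simulation anxiety scores (11 ELLs, 10 simulations, two groups).
import numpy as np
import pandas as pd
import pingouin as pg  # assumed library choice, not from the study

rng = np.random.default_rng(0)
rows = []
for subj in range(11):
    group = "control" if subj < 5 else "experimental"  # 5 vs. 6, as in the study
    slope = -1.0 if group == "control" else -1.8       # steeper decline assumed for the MUVE group
    for t in range(1, 11):                             # 10 simulation activities
        rows.append({"subject": subj, "group": group, "time": t,
                     "anxiety": 60 + slope * t + rng.normal(0, 3)})
df = pd.DataFrame(rows)

# Mixed (between x within) repeated measures ANOVA; pingouin applies a
# sphericity correction when needed, while the dissertation reports the
# Huynh-Feldt epsilon correction.
aov = pg.mixed_anova(data=df, dv="anxiety", within="time",
                     subject="subject", between="group")
print(aov.round(4))
```

The group-by-time interaction row in the output is where a differential anxiety reduction, such as the one reported for the Second Life group, would show up.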
Abstract:
This research examines evolving issues in applied computer science, applying economic and business analyses as well. There are two main areas. The first is internetwork communications as embodied by the Internet. The goal of the research is to devise an efficient pricing, prioritization, and incentivization plan that could realistically be implemented on the existing infrastructure. Criteria include practical and economic efficiency, and proper incentives for both users and providers. Background information on the evolution and functional operation of the Internet is given, and relevant literature is surveyed and analyzed. Economic analysis is performed on the incentive implications of the current pricing structure and organization. The problems are identified, and minimally disruptive solutions are proposed for all levels of implementation, down to the lowest-level protocol. Practical issues are considered and performance analyses are done. The second area of research is mass-market software engineering and how this differs from classical software engineering. Software life-cycle revenues are analyzed, and software pricing and timing implications are derived. A profit-maximizing methodology is developed to select or defer the development of software features for inclusion in a given release. An iterative model of the stages of the software development process is developed, taking into account new communications capabilities as well as profitability.
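The feature selection-or-deferral step can be read as a small 0/1 knapsack problem; the following Python sketch is one interpretation (not the dissertation's actual methodology), with invented feature names, costs, and revenue figures.

```python
# Hypothetical illustration: choosing which features to ship in a release.
# Each feature has an estimated development cost and expected revenue;
# features that do not fit the budget are deferred to a later release.
from itertools import combinations

features = {           # name: (cost in person-weeks, expected revenue) -- invented
    "search": (4, 9.0),
    "export": (2, 3.5),
    "sync":   (6, 11.0),
    "themes": (3, 4.0),
}

def best_release(features, budget):
    """Exhaustive 0/1 knapsack: fine for a handful of candidate features."""
    best, best_profit = (), 0.0
    names = list(features)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(features[f][0] for f in combo)
            profit = sum(features[f][1] for f in combo)
            if cost <= budget and profit > best_profit:
                best, best_profit = combo, profit
    return best, best_profit

ship, profit = best_release(features, budget=8)
print(f"ship now: {ship}, expected revenue: {profit}")
# Deferred features re-enter the candidate pool in the next release iteration.
```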
Abstract:
Modern software systems are often large and complicated. To better understand, develop, and manage large software systems, researchers have studied software architectures, which provide the top-level overall structural design of software systems, for the last decade. One major research focus in software architecture is formal architecture description languages, but most existing research concentrates primarily on descriptive capability and puts less emphasis on software architecture design methods and formal analysis techniques, which are necessary to develop correct software architecture designs. Refinement is a general approach of adding detail to a software design. A formal refinement method can further ensure certain design properties. This dissertation proposes refinement methods, including a set of formal refinement patterns and complementary verification techniques, for software architecture design using the Software Architecture Model (SAM), which was developed at Florida International University. First, a general guideline for software architecture design in SAM is proposed. Second, specification construction through property-preserving refinement patterns is discussed. The refinement patterns are categorized into connector refinement, component refinement, and high-level Petri net refinement; these three levels of refinement patterns are applicable to overall system interaction, architectural components, and the underlying formal language, respectively. Third, verification after modeling is discussed as a complementary technique to specification refinement. Two formal verification tools, the Stanford Temporal Prover (STeP) and the Simple Promela Interpreter (SPIN), are adopted into SAM to develop the initial models. Fourth, formalization and refinement of security issues are studied, and a method for security enforcement in SAM is proposed. The Role-Based Access Control model is formalized using predicate transition nets and Z notation, and patterns for enforcing access control and auditing are proposed. Finally, modeling and refining a life insurance system is used to demonstrate how to apply the refinement patterns for software architecture design using SAM and how to integrate the access control model. The results of this dissertation demonstrate that a refinement method is an effective way to develop a high-assurance system. The method developed in this dissertation extends existing work on modeling software architectures using SAM and makes SAM a more usable and valuable formal tool for software architecture design.
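To make the underlying formalism concrete, here is a toy place/transition Petri net in Python with the standard enabledness and firing rules. This is a minimal sketch of the kind of net SAM builds on, not SAM itself; SAM's components actually use higher-level predicate transition nets.

```python
# Toy place/transition Petri net: markings are multisets of tokens over
# places, and a transition fires by consuming its input tokens and
# producing its output tokens.
from collections import Counter

class PetriNet:
    def __init__(self, transitions, marking):
        # transitions: name -> (input-place Counter, output-place Counter)
        self.transitions = transitions
        self.marking = Counter(marking)

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t!r} is not enabled")
        inputs, outputs = self.transitions[t]
        self.marking -= inputs
        self.marking += outputs

# A hypothetical request/grant connector: a token in 'req' moves to 'granted'.
net = PetriNet(
    transitions={"grant": (Counter({"req": 1}), Counter({"granted": 1}))},
    marking={"req": 1},
)
net.fire("grant")
print(dict(net.marking))   # {'granted': 1}
```

Refinement, in these terms, replaces a transition such as "grant" with a more detailed subnet while preserving properties of the original marking behavior.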
Abstract:
This dissertation describes the findings and implications of a correlational analysis. Scores earned on the Computerized Placement Test (CPT), sentence skills, were compared to essay scores of advanced English as a Second Language (ESL) students. As the CPT is designed for native speakers of English, it was hypothesized that it could be an invalid or unreliable instrument for non-native speakers. Florida community college students are mandated to take the CPT to determine preparedness, as are students at many other U.S. and Canadian colleges. If incoming students score low on the CPT, they may be required to take up to three semesters of remedial coursework. It is essential that scores earned by non-native speakers of English accurately reflect their ability level, as they constitute a large and growing body of non-traditional students enrolled at community colleges. The study was conducted at Miami-Dade Community College, Wolfson Campus, in fall 1997. Participants included 106 advanced ESL students who took the CPT sentence skills test and wrote final essay exams. The essay exams were holistically scored by trained readers. The participants also took the Placement Articulation Software Service (PASS) exam, an alternative form of the CPT. Scores on the CPT and essays were compared by means of a Pearson product-moment correlation to validate the CPT; scores on the CPT and the PASS exam were compared in the same manner to verify reliability. A percentage of appropriate placements was determined by comparing essay scores to CPT cutoff score ranges. Finally, the instruments were evaluated by means of independent-samples t-tests for performance differences between gender, age, and first-language groups. The results indicate that the CPT sentence skills test is a valid and reliable placement instrument for advanced-level ESL students who intend to pursue community college degrees. The correlations demonstrated a substantial relationship between CPT and essay scores and a marked relationship between CPT and PASS scores. Appropriate placements were made in 86% of the cases. Furthermore, the CPT was found to discriminate equally among the gender, age, and first-language groups included in this study.
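For readers unfamiliar with these statistics, the sketch below reproduces the shape of the validity, reliability, and group-difference checks on synthetic scores (the study's data are not reproduced here); the library choice, effect sizes, and variable names are all assumptions.

```python
# Illustrative only: correlational validity/reliability checks and an
# independent-samples t-test on synthetic placement scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 106                                       # participants in the study
essay = rng.normal(7, 1.5, n)                 # synthetic holistic essay scores
cpt = 10 * essay + rng.normal(0, 12, n)       # synthetic CPT sentence-skills scores
pass_ = cpt + rng.normal(0, 8, n)             # synthetic PASS (alternate-form) scores

# Validity: CPT vs. essay; reliability: CPT vs. PASS alternate form.
r_validity, _ = stats.pearsonr(cpt, essay)
r_reliability, _ = stats.pearsonr(cpt, pass_)
print(f"CPT-essay r = {r_validity:.2f}, CPT-PASS r = {r_reliability:.2f}")

# Group-difference check, e.g. by gender (labels synthetic as well).
gender = rng.choice(["F", "M"], n)
t, p = stats.ttest_ind(cpt[gender == "F"], cpt[gender == "M"], equal_var=False)
print(f"gender t = {t:.2f}, p = {p:.3f}")
```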
Abstract:
This research examines the life pathways of 1.5- and second-generation Haitian immigrants in South Florida. The purpose of the research is to better understand how integration occurs for the children of Haitian immigrants as they transition from adolescence to adulthood. Building upon a prior study of second-generation immigrant adolescents between 1995 and 2000, a subset of the original participants was located to participate in this follow-up research. Qualitative interviews were conducted, as well as in-depth ethnographic research, including participant observation. Survey instruments used with other second-generation populations were also administered, enabling comparisons with the Children of Immigrants Longitudinal Study (CILS). The results indicate that educational and occupational achievements were markedly below the participants' original expectations as adolescents. Gender figures prominently in participants' familial roles and relationships, with men and women distinctly incorporating both Haitian and American cultural practices within their households. Contrary to previous research, the results on participants' identification suggest that these young adults claim attachment both to Haiti and to the United States. The unique longitudinal and ethnographic nature of this study contributes to the ongoing discussion of the integration of the children of immigrants by demonstrating significant variation from the integration trends previously observed among Haitian adolescents. The results cast doubt on the ability of existing theory on the children of immigrants to explain the trajectory of Haitian-American integration patterns. Specifically, this research indicates that Haitians are not downwardly mobile and integrating as African Americans: they have higher education and economic standing than their parents and are continuing their education well into their thirties. The respondents have multiple identities in which they increasingly express identification with Haiti, but in some contexts are also developing racialized identifications with African Americans and others of the African diaspora.
Abstract:
The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas, and minority populations experience the effects of lead poisoning disproportionately. Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment. Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation; a second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant. Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A; applying product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years. This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
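As a simplified stand-in for this reliability analysis, the sketch below fits a two-parameter Weibull model to synthetic time-to-failure data and reads off the 80% reliability life as the 20th percentile; the dissertation itself used Shapiro-Meeker graphical analysis and Cox's general log-linear model, and all numbers here are invented.

```python
from scipy import stats

# Synthetic failure times in normal-use years (invented, not the study's data).
failure_years = stats.weibull_min.rvs(c=3.0, scale=24.0, size=40, random_state=2)

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(failure_years, floc=0)

# The 80% reliability life is the time by which only 20% of specimens
# have failed, i.e. the 20th percentile of the fitted distribution.
t80 = stats.weibull_min.ppf(0.20, shape, loc=loc, scale=scale)
print(f"estimated 80% reliability life: {t80:.1f} years")
```

A result above the 20-year encapsulation requirement, as reported for encapsulant A, would indicate the product meets the reliability target under normal use.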
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method to mine Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
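McPatom itself builds partial order models and applies model checking, but the core intuition can be sketched with the classic unserializable three-access interleaving patterns. The toy checker below (a hypothetical illustration, not McPatom's algorithm) scans a trace for one shared variable and one pair of threads, mirroring the pairwise reduction described above.

```python
# Unserializable patterns (local access 1, remote access, local access 2):
# a remote thread's access splitting a local thread's intended atomic
# region in a way no serial execution could produce.
UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"), ("W", "R", "W"), ("R", "W", "W")}

def find_violations(trace, var):
    """trace: list of (thread, op, var) events in observed order."""
    events = [(t, op) for (t, op, v) in trace if v == var]
    violations = []
    for i in range(len(events) - 2):
        (t1, op1), (t2, op2), (t3, op3) = events[i:i + 3]
        if t1 == t3 and t1 != t2 and (op1, op2, op3) in UNSERIALIZABLE:
            violations.append((i, (op1, op2, op3)))
    return violations

# Hypothetical interleaved trace: thread A's read-then-write on x is split
# by thread B's write -- the classic lost-update pattern (R, W, W).
trace = [("A", "R", "x"), ("B", "W", "x"), ("A", "W", "x")]
print(find_violations(trace, "x"))   # [(0, ('R', 'W', 'W'))]
```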
Abstract:
Despite research showing the benefits of glycemic control, it remains suboptimal among adults with diabetes in the United States. Possible reasons include unaddressed risk factors as well as lack of awareness of its immediate and long-term consequences. The objectives of this study were: (1) using cross-sectional data, to ascertain the association between suboptimal (Hemoglobin A1c (HbA1c) ≥7%), borderline (HbA1c 7-8.9%), and poor (HbA1c ≥9%) glycemic control and potentially new risk factors (e.g., work characteristics); (2) to assess whether aspects of poor health and well-being such as poor health-related quality of life (HRQOL), unemployment, and missed work are associated with glycemic control; and (3) using prospective data, to assess the relationship between mortality risk and glycemic control in US adults with type 2 diabetes. Data from the 1988-1994 and 1999-2004 National Health and Nutrition Examination Surveys were used. HbA1c values were used to create dichotomous glycemic control indicators. Binary logistic regression models were used to assess relationships between risk factors, employment status, and glycemic control. Multinomial logistic regression analyses were conducted to assess relationships between glycemic control and HRQOL variables. Zero-inflated Poisson regression models were used to assess relationships between missed work days and glycemic control. Cox proportional hazards models were used to assess effects of glycemic control on mortality risk. Using STATA software, analyses were weighted to account for the complex survey design and non-response. Multivariable models adjusted for socio-demographics and body mass index, among other variables. Results revealed that being a farm worker and working over 40 hours/week were risk factors for suboptimal glycemic control. Having more days of poor mental health was associated with suboptimal, borderline, and poor glycemic control. Having more days of inactivity was associated with poor glycemic control, while having more days of poor physical health was associated with borderline glycemic control. There were no statistically significant relationships between glycemic control and self-reported general health, employment, or missed work. Finally, having an HbA1c value less than 6.5% was protective against mortality. The findings suggest that work-related factors are important in a person's ability to reach optimal diabetes management levels. Poor glycemic control appears to have significant detrimental effects on HRQOL.
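As a rough analogue of one of these models (the study used STATA's survey estimators), the sketch below fits a weighted logistic regression for suboptimal glycemic control on synthetic data; all variable names, weights, and coefficients are invented, and a full survey-design analysis would also need design-based standard errors.

```python
# Illustrative weighted logistic regression on synthetic NHANES-like data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "farm_worker": rng.integers(0, 2, n),
    "hours_over_40": rng.integers(0, 2, n),
    "bmi": rng.normal(29, 5, n),
    "svy_weight": rng.integers(1, 4, n),   # stand-in survey weights
})
# Synthetic binary outcome loosely tied to the predictors above.
logit = -1.0 + 0.5 * df.farm_worker + 0.4 * df.hours_over_40 + 0.02 * (df.bmi - 29)
df["suboptimal_a1c"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Weighted logistic regression; STATA's svy commands additionally adjust
# standard errors for stratification and clustering.
X = sm.add_constant(df[["farm_worker", "hours_over_40", "bmi"]])
fit = sm.GLM(df["suboptimal_a1c"], X, family=sm.families.Binomial(),
             freq_weights=df["svy_weight"]).fit()
print(np.exp(fit.params))   # odds ratios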
Abstract:
This dissertation contributes to the growing literature on identity formation by formulating, implementing, and testing the effectiveness of a psychosocial intervention, the Making Life Choices (MLC) Workshops, designed to facilitate the process of identity formation. More specifically, the MLC Workshops were designed to foster the development and use of critical cognitive and communicative skills and competencies in choosing and fulfilling life goals and values. The MLC Workshops consist of a psychosocial group intervention that includes both didactic and experiential group exercises. The primary research question for this study concerned the effectiveness of the MLC Workshops relative to a control condition. Effectiveness was evaluated on two levels: skills development and reduction of distress. First, the effectiveness of the MLC Workshops in fostering the development of critical competencies was evaluated relative to a control condition, and no statistically significant differences were found. Second, the effectiveness of the MLC Workshops in decreasing life distress was also evaluated relative to the control condition. While participants in the MLC Workshops showed no significant decrease in distress, they did show a statistically significant improvement in life satisfaction in the Personal Domain.