25 results for Modeling techniques
in Digital Commons at Florida International University
Abstract:
This study evaluated the relative fit of Finn's (1989) Participation-Identification model and Wehlage, Rutter, Smith, Lesko, and Fernandez's (1989) School Membership model of high school completion to a sample of 4,597 eighth graders taken from the National Educational Longitudinal Study of 1988 (NELS:88), using structural equation modeling techniques. The study found support for the importance of educational engagement as a factor in understanding academic achievement. The Participation-Identification model fit particularly well when applied to the samples of high school completers, dropouts (both overall and White dropouts), and African-American students. The study also confirmed the contribution of school environmental factors (i.e., size, diversity of economic and ethnic status among students) and family resources (i.e., availability of learning resources in the home and parent educational level) to students' educational engagement. Based on these findings, school social workers will need to be more attentive to macro-level interventions (i.e., community organization, interagency coordination) to achieve the organizational restructuring needed to address future challenges. The support found for the Participation-Identification model argues for a shift in school social workers' attention from reactive attempts to improve the affective-interpersonal lives of students to proactive attention to their academic lives. The model concentrates school social work practice on the central mission of schools, which is educational engagement. School social workers guided by this model would be encouraged to seek changes in school policies and organization that facilitate educational engagement.
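As a rough illustration of the kind of model-fit comparison described above, the sketch below uses the Python SEM package semopy to fit two competing structural models and report fit statistics. The model syntax, variable names, and data file are hypothetical placeholders, not the actual NELS:88 measures or the author's specification.

```python
# Hypothetical sketch: comparing the relative fit of two structural models
# with the Python SEM package `semopy` (variable and file names are
# placeholders, not the actual NELS:88 measures).
import pandas as pd
import semopy

data = pd.read_csv("nels88_subsample.csv")  # hypothetical file

participation_identification = """
    Identification ~ Participation
    Completion ~ Identification
"""

school_membership = """
    Membership ~ SchoolBonding + Engagement
    Completion ~ Membership
"""

for name, desc in [("Participation-Identification", participation_identification),
                   ("School Membership", school_membership)]:
    model = semopy.Model(desc)
    model.fit(data)
    stats = semopy.calc_stats(model)  # chi-square, CFI, RMSEA, etc.
    print(name)
    print(stats.T)
```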
Abstract:
Sexual victimization of young women typically occurs within a context of alcohol use, such that women are more likely to be victimized on days on which they consume alcohol than on days on which no alcohol is consumed. Additionally, most research on sexual victimization of women has focused on forced sexual acts; consequently, little is known about the forms of sexual victimization that college women typically experience, such as brief (e.g., unwanted touching) or verbally coerced experiences (e.g., doing sexual things to prevent a partner from leaving). Finally, there is a need for more research on the processes underlying college women's drinking and the specific mechanisms through which drinking increases risk for sexual victimization. This dissertation sought to replicate recent findings of a temporal association between alcohol use and sexual victimization, and to investigate whether binge drinking increased risk for victimization, within a sample of young Hispanic college women, using repeated-measures logistic regression. This study also aimed to identify and explore typologies of victimization experiences in order to better understand the types of sexual victimization common among young college women. Finally, the validity of a model of alcohol use and sexual victimization was investigated using structural equation modeling techniques. The results confirmed and extended previous research by demonstrating an increase in the conditional probability of sexual victimization on days of alcohol consumption compared with days of no alcohol consumption, and on days of binge alcohol consumption compared with days of moderate alcohol consumption. Sexual victimization experiences reported in this study were diverse, and cluster analysis was used to identify and explore specific typologies of victimization experiences, including intimate relationship victimization, brief victimization with a stranger, prolonged victimization with an acquaintance, and workplace victimization. The results from structural equation modeling (SEM) analyses were complex and helped to illuminate the relationships between reasons for drinking, alcohol use, childhood sexual abuse, sexual victimization, psychopathology, and acculturation-related factors among Hispanic college women. These findings have implications for the design of university-based prevention and intervention efforts aimed at reducing rates of alcohol-related sexual victimization within Hispanic populations.
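The day-level association reported above (victimization more likely on drinking and binge days) is the kind of question a repeated-measures logistic regression answers. The following is a minimal sketch using statsmodels' GEE with a binomial family; the long-format daily diary columns (subject_id, victimized, any_drinking, binge) and the data file are hypothetical, not the study's actual variables.

```python
# Minimal sketch of a repeated-measures (day-level) logistic regression,
# assuming a long-format daily diary dataset with hypothetical columns:
# subject_id, victimized (0/1), any_drinking (0/1), binge (0/1).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

daily = pd.read_csv("daily_reports.csv")  # hypothetical file

model = smf.gee(
    "victimized ~ any_drinking + binge",
    groups="subject_id",                      # repeated days nested within women
    data=daily,
    family=sm.families.Binomial(),            # logistic link for a binary outcome
    cov_struct=sm.cov_struct.Exchangeable(),  # working correlation across days
)
result = model.fit()
print(result.summary())
print("Odds ratios:\n", np.exp(result.params))  # day-level increase in odds
```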
Abstract:
This study identifies and describes HIV Voluntary Counseling and Testing (VCT) among middle-aged and older Latinas. The rate of new HIV cases among people age 45 and older is rapidly increasing, with a 40.6% increase in the number of older Latinas infected with HIV between 1998 and 2002. Despite this increase, there is a paucity of research on this population. This research seeks to address that gap through a secondary data analysis of Latina women. The aim of this study is twofold: (1) to develop and empirically test a multivariate model of VCT utilization for middle-aged and older Latinas; and (2) to test how the three individual components of the Andersen Behavioral Model affect VCT for middle-aged and older Latinas. The study is organized around the three major domains of the Andersen Behavioral Model of service use: (a) predisposing factors, (b) enabling characteristics, and (c) need. Logistic regression using structural equation modeling techniques was used to test multivariate relationships among variables on VCT for a sample of 135 middle-aged and older Latinas residing in Miami-Dade County, Florida. Over 60% of participants had been tested for HIV. Provider endorsement was found to be the strongest predictor of VCT (odds ratio [OR] = 6.38), followed by having a clinic as a regular source of healthcare (OR = 3.88). Significant negative associations with VCT included self-rated health status (OR = .592), age (OR = .927), Spanish proficiency (OR = .927), number of sexual partners (OR = .613), and consumption of alcohol during sexual activity (OR = .549). As this line of inquiry provides a critical glimpse into the VCT of older Latinas, recommendations for enhanced service provision and research will be offered.
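The odds ratios reported above come from a logistic model of VCT use organized around the Andersen domains. A minimal sketch of that kind of analysis with statsmodels is shown below; the predictor column names and data file are hypothetical placeholders for the study's measures.

```python
# Minimal sketch of a logistic regression of HIV testing (VCT) on
# Andersen-model predictors; column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

latinas = pd.read_csv("vct_survey.csv")  # hypothetical file

fit = smf.logit(
    "ever_tested ~ provider_endorsement + clinic_regular_source"
    " + self_rated_health + age + spanish_proficiency"
    " + num_partners + alcohol_with_sex",
    data=latinas,
).fit()

odds_ratios = np.exp(fit.params)    # OR > 1 raises the odds of having been tested
ci = np.exp(fit.conf_int())         # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```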
Abstract:
The distinctive karstic, freshwater wetlands of the northern Caribbean and Central American region support the prolific growth of calcite-rich periphyton mats. Aside from the Everglades, very little research has been conducted in these karstic wetlands, which are increasingly threatened by eutrophication. This study sought to (i) test the hypothesis that water depth and periphyton total phosphorus (TP) content are both drivers of periphyton biomass in karstic wetland habitats in Belize, Mexico, and Jamaica, (ii) provide a taxonomic inventory of the periphytic diatom species in these wetlands, and (iii) examine the relationship between periphyton mat TP concentration and diatom assemblage at Everglades and Caribbean locations. Periphyton biomass, nutrient, and diatom assemblage data were generated from periphyton mat samples collected from shallow, marl-based wetlands in Belize, Mexico, and Jamaica. These data were compared to a larger dataset collected from comparable sites within Everglades National Park. A diatom taxonomic inventory was conducted on the Caribbean samples, and a combination of ordination and weighted-averaging modeling techniques was used to compare relationships between periphyton TP concentration, periphyton biomass, and diatom assemblage composition among the locations. Within the Everglades, periphyton biomass showed a negative correlation with water depth and mat TP, while periphyton mat percent organic content was positively correlated with these two variables. These patterns were also exhibited within the Belize, Mexico, and Jamaica locations, suggesting that water depth and periphyton TP content are both drivers of periphyton biomass in karstic wetland systems within the northern Caribbean region. A total of 146 diatom species representing 39 genera were recorded from the three Caribbean locations, including a distinct core group of species that may be endemic to this habitat type. Weighted averaging models were produced that effectively predicted mat TP concentration from diatom assemblages for both Everglades (R² = 0.56) and Caribbean (R² = 0.85) locations. There were, however, significant differences among Everglades and Caribbean locations with respect to species TP optima and indicator species. This suggests that although diatoms are effective indicators of water quality in these wetlands, differences in species response to water quality changes can reduce the predictive power of these indices when applied across systems.
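The weighted-averaging (WA) calibration behind the TP models above is straightforward to express directly: each species' TP optimum is the abundance-weighted mean of TP across calibration samples, and a new sample's inferred TP is the abundance-weighted mean of the optima of the species present. The numbers below are toy placeholders, and deshrinking (normally part of WA calibration) is omitted for brevity.

```python
# Minimal numpy sketch of a weighted-averaging (WA) transfer function:
# species TP optima are abundance-weighted means of observed TP, and
# inferred TP for a new sample is the abundance-weighted mean of optima.
import numpy as np

# Hypothetical calibration set: rows = samples, columns = diatom species
abundances = np.array([[10, 0, 5],
                       [2, 8, 0],
                       [0, 4, 6]], dtype=float)
observed_tp = np.array([120.0, 450.0, 300.0])   # periphyton mat TP (ug/g)

# Species optima: weighted average of TP, weights = species abundances
optima = (abundances * observed_tp[:, None]).sum(axis=0) / abundances.sum(axis=0)

def infer_tp(sample_counts):
    """WA inference for a new diatom count vector (no deshrinking applied)."""
    w = np.asarray(sample_counts, dtype=float)
    return (w * optima).sum() / w.sum()

print("Species TP optima:", optima.round(1))
print("Inferred TP for a new sample:", round(infer_tp([3, 1, 9]), 1))
```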
Abstract:
Despite considerable progress in developing and testing psychosocial treatments to reduce youth anxiety disorders, much remains to be learned about the relation between anxiety symptom reduction and change in youth functional impairment. The specific aims of this dissertation were therefore to examine: (1) the relation between different levels of anxiety and youth functional impairment ratings; (2) the incremental validity of the Children's Global Assessment Scale (CGAS); (3) the mediating role of anxiety symptom reduction on youth functional impairment ratings; (4) the directionality of change between anxiety symptom reduction and youth functional impairment; (5) the moderating effects of youth age, sex, and ethnicity on the mediated relation between youth anxiety symptom reduction and change in functional impairment; and (6) agreement (or lack thereof) between youths and their parents in their views of change in youth functional impairment vis-à-vis anxiety symptom reduction. The results were analyzed using an archival data set acquired from 183 youths and their mothers. Research questions were tested using SPSS and structural equation modeling techniques in Mplus. The results supported the efficacy of psychosocial treatments in reducing the severity of youth anxiety symptoms and the associated functional impairment. Moreover, the results revealed that at posttreatment, youths who scored either low or medium on anxiety levels scored significantly lower on impairment than youths who scored high on anxiety levels. Incremental validity of the CGAS was also demonstrated across all assessment points and informants in the sample. In addition, the results indicated a mediating role of anxiety symptom reduction with respect to change in youth functional impairment at posttest, regardless of the youth's age, sex, and ethnicity. No significant findings were observed with regard to bidirectionality or informant disagreement vis-à-vis the relation between anxiety symptom reduction and change in functional impairment. The study's main contributions and potential implications at the theoretical, empirical, and clinical levels are further discussed. The emphasis is on the need to enhance existing evidence-based treatments and develop innovative treatment models that not only reduce youths' symptoms (such as anxiety) but also evoke genuine and palpable improvements in the lives of youths and their families.
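The mediation question above (treatment reduces anxiety symptoms, which in turn changes functional impairment) is often summarized as a product-of-coefficients indirect effect. The study itself used SEM in Mplus; the sketch below is only a simplified regression-based stand-in with hypothetical variable names, and omits bootstrapped confidence intervals.

```python
# Simplified product-of-coefficients mediation sketch (treatment -> anxiety
# symptom reduction -> change in functional impairment). Variable and file
# names are hypothetical; the dissertation's actual analysis used SEM in Mplus.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("treatment_outcomes.csv")  # hypothetical file

# Path a: treatment predicts anxiety symptom reduction
path_a = smf.ols("anxiety_reduction ~ treatment", data=df).fit()
# Path b (and c'): anxiety reduction predicts impairment change, controlling for treatment
path_b = smf.ols("impairment_change ~ anxiety_reduction + treatment", data=df).fit()

a = path_a.params["treatment"]
b = path_b.params["anxiety_reduction"]
indirect_effect = a * b   # mediated (indirect) effect; bootstrap CIs omitted for brevity
print("Indirect effect (a*b):", round(indirect_effect, 3))
```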
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are multifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they are actually experiencing; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, to accurately model the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these modeling tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
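To make the performance-modeling step concrete, the sketch below trains the two model families named in the abstract (a neural network and support vector regression) with scikit-learn on synthetic data. The features (CPU cap, memory, I/O share), target (response time), and the synthetic workload are illustrative assumptions, not the thesis's actual benchmarks or tuning.

```python
# Minimal scikit-learn sketch of the two performance-modeling approaches
# named in the abstract (neural network and support vector regression);
# the features, target, and synthetic data are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Hypothetical training data: [cpu_cap, mem_mb, io_share] -> response_time_ms
X = rng.uniform([0.1, 256, 0.1], [4.0, 8192, 1.0], size=(500, 3))
y = 50 / X[:, 0] + 2e5 / X[:, 1] + 20 / X[:, 2] + rng.normal(0, 2, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ann = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(32, 32),
                                                   max_iter=2000, random_state=0))
svm = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))

for name, model in [("ANN", ann), ("SVM", svm)]:
    model.fit(X_train, y_train)                # learn allocation -> performance mapping
    print(name, "R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```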
Abstract:
Within the Stage II program evaluation of the Miami Youth Development Project's (YDP) Changing Lives Program (CLP), this study evaluated CLP intervention effectiveness in promoting positive change in emotion-focused identity exploration (i.e., feelings of personal expressiveness; PE) and a "negative" symptom of identity development (i.e., identity distress; ID) as a first step toward the investigation of a self-transformative model of identity development in adolescent youth. Using structural equation modeling techniques, this study found that participation in the CLP is associated with positive changes in PE (path = .841, p < .002), but not with changes in ID. An increase in ID scores was also found to be associated with increases in PE (path = .229, p < .002). Intervention effects were not moderated by age/stage, gender, or ethnicity, though differences were found in the degree to which participating subgroups (African-American/Hispanic, male/female, 14-16 years old/17-19 years old) experienced change in PE and ID. Findings also suggest that moderate levels of ID may not be deleterious to identity exploration and may be associated with active exploration.
Abstract:
The purpose of the current study was to model various cognitive and social processes that are believed to lead to false confessions. More specifically, this study manipulated the variables of experimenter expectancy, guilt or innocence of the suspect, and interrogation techniques using the Russano et al. (2005) paradigm. The primary measure of interest was the likelihood of the participant signing the confession statement. By manipulating experimenter expectancy, the current study sought to further explore the social interactions that may occur in the interrogation room. In addition, in past experiments the interrogator has typically been restricted to the use of one or two interrogation techniques; in the present study, interrogators were permitted to select from 15 different interrogation techniques when attempting to elicit a confession from participants. Consistent with Russano et al. (2005), guilty participants (94%) were more likely to confess to the act of cheating than innocent participants (31%). The variable of experimenter expectancy did not affect confession rates, length of interrogation, or the type of interrogation techniques used. Path analysis revealed that feelings of pressure and the weighing of consequences on the part of the participant were associated with the signing of the confession statement. The findings suggest that the guilt or innocence of the participant, the participant's perceptions of the interrogation situation, and the length of interrogation play a pivotal role in the signing of the confession statement. Further examination of these variables may provide researchers with a better understanding of the relationship between interrogations and confessions.
Abstract:
Petri nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems, and they have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs result in compact system models that are easier to understand; therefore, HLPNs are more useful in modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called Predicate Transition Nets (PrT nets) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is the development of a software tool to support the formal modeling capabilities in this framework. For analysis, this framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is a straightforward and speedy method, but it only covers some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff, as it provides a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool to support the formal analysis capabilities in this framework. The SAMTools developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
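The "bounded" idea behind BMC (explore behavior only up to a depth k, trading completeness for efficiency) can be illustrated on a toy low-level Petri net with a depth-limited search over markings. This is only a conceptual sketch: the dissertation's BMC works on high-level nets via symbolic encodings, and the net below is an invented three-place example.

```python
# Toy illustration of bounded analysis of a (low-level) Petri net: explore
# reachable markings only up to depth k, checking a property at each marking.
# The dissertation's BMC encodes HLPN transition relations symbolically; this
# explicit depth-bounded search only conveys the "bounded" idea.
from collections import deque

# Hypothetical net: transition name -> (tokens consumed, tokens produced)
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
    "t3": ({"p3": 1}, {"p1": 1}),
}
initial_marking = {"p1": 1, "p2": 0, "p3": 0}

def enabled(marking, consumed):
    return all(marking.get(p, 0) >= n for p, n in consumed.items())

def fire(marking, consumed, produced):
    new = dict(marking)
    for p, n in consumed.items():
        new[p] -= n
    for p, n in produced.items():
        new[p] = new.get(p, 0) + n
    return new

def bounded_check(bad, k):
    """Return a firing sequence reaching a 'bad' marking within k steps, else None."""
    queue = deque([(initial_marking, [])])
    while queue:
        marking, path = queue.popleft()
        if bad(marking):
            return path
        if len(path) < k:
            for name, (consumed, produced) in transitions.items():
                if enabled(marking, consumed):
                    queue.append((fire(marking, consumed, produced), path + [name]))
    return None

print(bounded_check(lambda m: m.get("p3", 0) >= 1, k=5))  # e.g. ['t1', 't2']
```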
Abstract:
The Pleistocene carbonate-rock Biscayne Aquifer of south Florida contains laterally extensive bioturbated oolitic zones characterized by interconnected touching-vug megapores that channelize most flow and make the aquifer extremely permeable. Standard petrophysical laboratory techniques may not be capable of accurately measuring such high permeabilities, so innovative procedures that can measure high permeabilities were applied. These fragile rocks cannot easily be cored or cut to shapes convenient for conducting permeability measurements. For the laboratory measurement, a 3D epoxy-resin printed rock core was produced from computed tomography data obtained from an outcrop sample. Permeability measurements were conducted using a viscous fluid to permit easily observable head gradients (~2 cm over 1 m) while maintaining low-Reynolds-number flow. For a second permeability estimate, Lattice Boltzmann Method flow simulations were computed on the 3D core renderings. Agreement between the two estimates indicates that an accurate permeability was obtained that can be applied in future studies.
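The laboratory estimate described above amounts to applying Darcy's law to the observed head loss across the printed core. The sketch below computes intrinsic permeability as k = Q·μ·L / (A·ρ·g·Δh); only the ~2 cm over 1 m head gradient comes from the abstract, and every other numeric value (flow rate, viscosity, core diameter, fluid density) is a hypothetical placeholder.

```python
# Minimal sketch of inferring intrinsic permeability from Darcy's law,
# k = Q * mu * L / (A * rho * g * dh). All numeric values except the head
# gradient are hypothetical placeholders, not the study's measurements.
import math

Q   = 2.0e-6   # volumetric flow rate through the core (m^3/s), hypothetical
mu  = 0.05     # dynamic viscosity of the viscous working fluid (Pa*s), hypothetical
L   = 1.0      # flow path length over which head loss is observed (m)
dh  = 0.02     # observed head loss (m), ~2 cm over 1 m as in the abstract
d   = 0.10     # core diameter (m), hypothetical
rho = 1000.0   # fluid density (kg/m^3), hypothetical
g   = 9.81     # gravitational acceleration (m/s^2)

A = math.pi * (d / 2) ** 2              # cross-sectional area of the core
k = Q * mu * L / (A * rho * g * dh)     # intrinsic permeability (m^2)

print(f"Intrinsic permeability k = {k:.3e} m^2")
```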
Abstract:
A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. However, providing such a methodology is challenging because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach that verifies models on different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified on the transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the involved nets. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture of mobile agent systems and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency, and safety of the medical information processing system. From successfully modeling and analyzing the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency, and low cost of mobile agent technologies.
Abstract:
This dissertation establishes the foundation for a new 3-D visual interface integrating Magnetic Resonance Imaging (MRI) with Diffusion Tensor Imaging (DTI). The need for such an interface is critical for understanding brain dynamics and for providing more accurate diagnosis of key brain dysfunctions in terms of neuronal connectivity. This work involved two research fronts: (1) the development of new image processing and visualization techniques in order to accurately establish the relational positioning of neuronal fiber tracts and key landmarks in 3-D brain atlases, and (2) addressing the computational requirements so that processing time remains within the practical bounds of clinical settings. The system was evaluated using data from thirty patients and volunteers from the Brain Institute at Miami Children's Hospital. Innovative visualization mechanisms allow, for the first time, white matter fiber tracts to be displayed alongside key anatomical structures within accurately registered 3-D semi-transparent images of the brain. The segmentation algorithm is based on the calculation of mathematically tuned thresholds and region-detection modules. The uniqueness of the algorithm is in its ability to perform fast and accurate segmentation of the ventricles. In contrast to manual selection of the ventricles, which averaged over 12 minutes, the segmentation algorithm averaged less than 10 seconds in its execution. The registration algorithm searches and compares MR and DT images of the same subject, with derived correlation measures quantifying the resulting accuracy. Overall, the images were 27% more correlated after registration, while registration, interpolation, and re-slicing of the images in all the given dimensions took an average of only 1.5 seconds. This interface was fully embedded into a fiber-tracking software system in order to establish an optimal research environment. This highly integrated 3-D visualization system reached a practical level that makes it ready for clinical deployment.
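Two of the processing steps described above (threshold-based segmentation with region detection, and a correlation measure of MR/DTI alignment) can be sketched with NumPy and SciPy. The volumes, threshold rule, and intensity model below are synthetic stand-ins, not the dissertation's algorithm or data.

```python
# Minimal sketch of the two processing steps described above: threshold-based
# segmentation with connected-region labeling, and a correlation measure
# quantifying MR/DTI alignment. The volumes and threshold rule are synthetic.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
mr_volume = rng.normal(100, 20, size=(64, 64, 64))              # synthetic "MR" volume
dti_volume = mr_volume * 0.8 + rng.normal(0, 5, (64, 64, 64))   # roughly aligned "DTI"

# 1. Segmentation: threshold, then keep the largest connected region
threshold = mr_volume.mean() + 0.5 * mr_volume.std()   # stand-in for a tuned threshold
mask = mr_volume > threshold
labels, n_regions = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=np.arange(1, n_regions + 1))
largest_region = labels == (np.argmax(sizes) + 1)

# 2. Registration quality: voxel-wise correlation between the two volumes
corr = np.corrcoef(mr_volume.ravel(), dti_volume.ravel())[0, 1]
print(f"{n_regions} regions found; largest has {int(largest_region.sum())} voxels")
print(f"MR/DTI correlation: {corr:.3f}")
```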