886 results for Critical factors of success
Abstract:
The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this study tests the effectiveness of a predictor model. The model is based on the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk for NCLEX-RN failure can be identified and remediated before taking the actual licensure examination. To date, no theory-based predictor model has been identified that predicts success on the NCLEX-RN. The model was tested using prerequisite course grades, nursing course grades, and scores on standardized examinations for the 2003 associate degree nursing graduates of an urban community college (N = 235). Success was determined through reports of a pass on the NCLEX-RN examination from the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power. Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were significantly related to NCLEX-RN success, the overall model was most predictive at the conclusion of the academic program of study. The RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
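The two-step procedure described here (point-biserial correlations to screen variable relationships, then logistic regression to test predictive power) can be sketched in Python. This is a minimal illustration on simulated data; the variable names, effect sizes, and sample construction are hypothetical placeholders, not the study's dataset.

```python
import numpy as np
from scipy.stats import pointbiserialr
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the study's predictors: prerequisite GPA,
# nursing course GPA, and a standardized exit-examination score.
n = 235
X = rng.normal(loc=[2.8, 3.0, 70.0], scale=[0.4, 0.3, 8.0], size=(n, 3))
# Simulated pass/fail outcome, loosely driven by the exam score.
passed = (0.08 * (X[:, 2] - 70) + rng.normal(0, 1, n) > -1).astype(int)

# Step 1: point-biserial correlation between each continuous predictor
# and the dichotomous pass/fail outcome.
for name, col in zip(["prereq_gpa", "nursing_gpa", "exit_exam"], X.T):
    r, p = pointbiserialr(passed, col)
    print(f"{name}: r_pb = {r:.2f}, p = {p:.3f}")

# Step 2: logistic regression to assess the model's predictive power.
model = LogisticRegression().fit(X, passed)
print("in-sample accuracy:", accuracy_score(passed, model.predict(X)))
```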
Abstract:
Hospitals and healthcare facilities in the United States are facing serious shortages of medical laboratory personnel which, if not addressed, stand to negatively impact patient care. The problem is compounded by a reduction in the number of academic programs and a resulting decrease in the number of graduates available to keep up with the increase in industry demand. Given these challenges, the purpose of this study was to identify predictors of success for students in a selected 2-year Medical Laboratory Technology Associate in Science degree program. The study examined five academic factors (College Placement Test Math and Reading scores, cumulative GPA, science GPA, and professional [first-semester laboratory courses] GPA) and demographic data to see whether any of these factors could predict program completion. The researcher examined academic records for a 10-year period (N = 158). In this retrospective design, correlational analysis between the variables and completion revealed significant relationships (p < .05) for CGPA, SGPA, CPT Math, and PGPA, indicating that students with higher CGPA, SGPA, CPT Math, and PGPA were more likely to complete their degree in 2 years. Binary logistic regression analysis with the same academic variables revealed that PGPA was the best predictor of program completion (p < .001). Additionally, the findings of this study are consistent with the academic portion of the Bean and Metzner Conceptual Model of Nontraditional Student Attrition, which points to academic outcome variables such as GPA as affecting attrition. The findings are therefore important to students and educators in the field of Medical Laboratory Technology, since PGPA is a predictor that can be used to provide early in-program intervention to at-risk students, increasing the chances of successful, timely completion.
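A binary logistic regression like the one described can be fit with statsmodels, whose summary reports per-predictor coefficients and p-values of the kind used to single out PGPA as the strongest predictor. The data frame below is simulated under the assumption that PGPA drives completion; all column names and values are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical records mimicking the study's academic variables (N = 158).
df = pd.DataFrame({
    "cpt_math": rng.normal(80, 10, 158),
    "cgpa": rng.normal(3.0, 0.4, 158),
    "sgpa": rng.normal(2.9, 0.5, 158),
    "pgpa": rng.normal(3.1, 0.4, 158),
})
# Simulated completion outcome driven mainly by first-semester
# professional GPA (PGPA), echoing the reported finding.
logit_p = 1 / (1 + np.exp(-(3.0 * (df["pgpa"] - 3.0))))
df["completed"] = rng.binomial(1, logit_p)

# Binary logistic regression: the coefficient table indicates which
# predictor contributes most to program completion.
X = sm.add_constant(df[["cpt_math", "cgpa", "sgpa", "pgpa"]])
result = sm.Logit(df["completed"], X).fit(disp=0)
print(result.summary2())
```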
Abstract:
For the last three decades, the Capital Asset Pricing Model (CAPM) has been the dominant model for calculating expected return. In the early 1990s, Fama and French (1992) developed the Fama and French three-factor model by adding two additional factors to the CAPM. However, even with these models, estimates of the expected return have been found to be inaccurate (Elton, 1999; Fama & French, 1997). Botosan (1997) introduced a new approach to estimating the expected return. This approach employs an equity valuation model to calculate the internal rate of return (IRR), often called the "implied cost of equity capital," as a proxy for the expected return, and it has been gaining popularity among researchers. A critical review of the literature will help inform hospitality researchers regarding the issue and encourage them to implement the new approach in their own studies.
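The implied cost of equity capital is the discount rate that equates a valuation model's forecast payoffs with the current market price, i.e., an IRR. A minimal sketch follows, assuming a simple cash-flow-plus-terminal-value stream with made-up numbers; this is not Botosan's exact specification, only the root-finding idea behind it.

```python
from scipy.optimize import brentq

# Illustrative inputs (not from any cited study): current share price
# and five years of forecast payoffs, the last including a terminal value.
price = 50.0
payoffs = [2.0, 2.2, 2.4, 2.6, 2.8 + 60.0]

def pricing_error(r):
    """Present value of forecast payoffs at rate r, minus the market price."""
    pv = sum(cf / (1 + r) ** t for t, cf in enumerate(payoffs, start=1))
    return pv - price

# The implied cost of equity is the rate at which the pricing error is zero.
implied_r = brentq(pricing_error, 1e-6, 1.0)
print(f"implied cost of equity: {implied_r:.2%}")
```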
Abstract:
A previous genome-wide association study (GWAS) of more than 100,000 individuals identified molecular-genetic predictors of educational attainment. We undertook an in-depth life-course investigation of the polygenic score derived from this GWAS using the four-decade Dunedin Study (N = 918). There were five main findings. First, polygenic scores predicted adult economic outcomes even after accounting for educational attainments. Second, genes and environments were correlated: Children with higher polygenic scores were born into better-off homes. Third, children's polygenic scores predicted their adult outcomes even when analyses accounted for their social-class origins; social-mobility analysis showed that children with higher polygenic scores were more upwardly mobile than children with lower scores. Fourth, polygenic scores predicted behavior across the life course, from early acquisition of speech and reading skills through geographic mobility and mate choice and on to financial planning for retirement. Fifth, polygenic-score associations were mediated by psychological characteristics, including intelligence, self-control, and interpersonal skill. Effect sizes were small. Factors connecting DNA sequence with life outcomes may provide targets for interventions to promote population-wide positive development.
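For readers unfamiliar with polygenic scores: a score is conventionally computed as the dosage-weighted sum of GWAS effect estimates across variants, then standardized within the sample. A minimal sketch on simulated genotypes; the array sizes and effect sizes are arbitrary placeholders, not the GWAS weights used by the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical inputs: GWAS effect estimates (betas) for m variants and
# a genotype matrix of allele dosages (0, 1, or 2) for n individuals.
m, n = 1000, 918
betas = rng.normal(0.0, 0.01, m)
dosages = rng.integers(0, 3, size=(n, m))

# A polygenic score is the dosage-weighted sum of effect estimates,
# usually standardized across the sample before analysis.
pgs = dosages @ betas
pgs_z = (pgs - pgs.mean()) / pgs.std()
print(pgs_z[:5])
```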
Abstract:
We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources, including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources, such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can detect historical patterns, but these may not be relevant in the future. Advancing the predictive capacity of changes in the distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed that (depending on the question) strive for more realism in the ecophysiology and behaviour of individuals, the life history strategies of species, and the trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g., physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources, including the ecological and economic costs and trade-offs of different spatial management strategies. Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.
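The closing recommendation, assessing structural uncertainty through biological ensemble modelling, amounts to running structurally different models on the same scenario and summarizing the across-model spread. A toy sketch follows; the four simulated trajectories and their trends are invented stand-ins for the model categories reviewed above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical projections of a stock's productivity for 2010-2100 from
# four structurally different models (the "biological ensemble").
years = np.arange(2010, 2101)
projections = np.stack([
    100 - 0.20 * (years - 2010) + rng.normal(0, 3, years.size),  # SDM-style
    100 - 0.15 * (years - 2010) + rng.normal(0, 3, years.size),  # biophysical
    100 - 0.30 * (years - 2010) + rng.normal(0, 3, years.size),  # food web
    100 - 0.25 * (years - 2010) + rng.normal(0, 3, years.size),  # end-to-end
])

# The ensemble mean gives a central projection; the across-model spread
# quantifies structural uncertainty that no single model can expose.
mean = projections.mean(axis=0)
spread = projections.std(axis=0)
print(f"2100 projection: {mean[-1]:.1f} +/- {spread[-1]:.1f}")
```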
Abstract:
Advances in digital photography and distribution technologies enable many people to produce and distribute images of their sex acts. When teenagers do this, the photos and videos they create can be legally classified as child pornography, since the law makes no exception for youth who create sexually explicit images of themselves. The dominant discussions about teenage girls producing sexually explicit media (including sexting) are profoundly unproductive: (1) they blame teenage girls for creating private images that another person later maliciously distributed, and (2) they fail to respect, or even discuss, teenagers' rights to freedom of expression. Cell phones and the internet make producing and distributing images extremely easy, providing widely accessible venues both for consensual sexual expression between partners and for sexual harassment. Dominant understandings view sexting as a troubling teenage trend created through the combination of camera phones and adolescent hormones and impulsivity, but this view often conflates consensual sexting between partners with the malicious distribution of a person's private image, treating them as essentially equivalent behaviors. In this project, I ask: What is the role of assumptions about teen girls' sexual agency in these problematic understandings of sexting that blame victims and deny teenagers' rights? In contrast to the popular media panic about online predators and the familiar accusation that youth are wasting their leisure time by using digital media, some people champion the internet as a democratic space that offers young people the opportunity to explore identities and develop social and communication skills. Yet when teen girls' sexuality enters this conversation, all this debate and discussion narrows to a problematic consensus. The optimists about adolescents and technology fall silent, and the argument that media production is inherently empowering for girls does not seem to apply to a girl who produces a sexually explicit image of herself. Instead, feminist, popular, and legal commentaries assert that she is necessarily a victim: of a "sexualized" mass media, pressure from her male peers, digital technology, her brain structures or hormones, or her own low self-esteem and misplaced desire for attention. Why and how are teenage girls' sexual choices produced as evidence of their failure or success in achieving Western liberal ideals of self-esteem, resistance, and agency? Since mass media and policy reactions to sexting have so far been overwhelmingly sexist and counterproductive, it is crucial to interrogate the concepts and assumptions that characterize mainstream understandings of sexting. I argue that the common sense co-produced by law and mass media underlies the problematic legal and policy responses to sexting. Analyzing a range of nonfiction texts including newspaper articles, talk shows, press releases, public service announcements, websites, legislative debates, and legal documents, I investigate gendered, racialized, age-based, and technologically determinist common-sense assumptions about teenage girls' sexual agency. I examine the consensus and continuities that exist between news, nonfiction mass media, policy, institutions, and law, and describe the limits of their debates. I find that this early 21st-century post-feminist girl-power moment not only demands that girls live up to gendered sexual ideals but also insists that actively choosing to follow these norms is the only way to exercise sexual agency. This is the first study to examine the relationship of conventional wisdom about digital media and teenage girls' sexuality to both policy and mass media.
Abstract:
The treatment of subglottic stenosis in children remains a challenge for the otolaryngologist and may involve endoscopic procedures, open surgery, and often both. In the recent past, high-pressure balloons have been used in endoscopic treatment owing to their relative ease of use and high success rates. The objectives were to report success rates in the treatment of acquired subglottic stenosis with balloon laryngoplasty in children and to identify predictive factors for the success of the technique and its complications. This was a descriptive, prospective study of children who were diagnosed with acquired subglottic stenosis and underwent balloon laryngoplasty as the primary treatment. Balloon laryngoplasty was performed in 48 children with an average age of 20.7 months: 31 presented with chronic subglottic stenosis and 17 with acute stenosis. The success rate was 100% for acute and 39% for chronic subglottic stenosis. Success was significantly associated with several factors, including recently acquired stenosis, lower initial grade of stenosis, younger patient age, and the absence of tracheotomy. Complications were transitory dysphagia, observed in three children, and a submucosal cyst in one patient. Balloon laryngoplasty may be considered a first-line treatment for acquired subglottic stenosis. In acute cases the success rate was 100%, and even though results are less promising in chronic cases, complications were not significant and patients can still undergo open surgery without contraindication. Predictive factors of success were acute stenosis, less severe grades of stenosis, younger patient age, and the absence of tracheotomy.
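As an aside, the acute-versus-chronic contrast can be checked with Fisher's exact test. The 2x2 counts below are my reconstruction from the abstract's figures (17/17 acute successes; 39% of 31 chronic cases is roughly 12 successes), not a table reported by the study.

```python
from scipy.stats import fisher_exact

# Counts reconstructed from the abstract (an assumption):
table = [[17, 0],    # acute:   success, failure
         [12, 19]]   # chronic: success, failure
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher exact p = {p_value:.4f}")
```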
Abstract:
The aim of this study was to assess the prevalence and risk factors of apical periodontitis in endodontically treated teeth in a selected population of Brazilian adults. A total of 1,372 periapical radiographs of endodontically treated teeth were analyzed for the quality of the root filling, the status of the coronal restoration, and the presence of posts, in association with apical periodontitis (AP). Data were analyzed statistically using odds ratios, confidence intervals, and the chi-square test. The prevalence of AP in teeth with adequate endodontic treatment was low (16.5%). This percentage dropped to 12.1% in cases with both adequate root filling and adequate coronal restoration. Teeth with adequate endodontic treatment and poor coronal restoration had an AP prevalence of 27.9%. AP prevalence increased to 71.7% in teeth with poor endodontic treatment associated with poor coronal restoration. When poor endodontic treatment was combined with adequate coronal restoration, AP prevalence was 61.8%. The prevalence of AP was low when associated with high technical quality of root canal treatment. Poor coronal restoration increased the risk of AP even when endodontic treatment was adequate (OR = 2.80; 95% CI = 1.87-4.22). The presence of intracanal posts had no influence on AP prevalence.
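For reference, an odds ratio with a log-method (Woolf) 95% confidence interval, of the kind reported above, is computed from a 2x2 table as sketched below; the counts are illustrative placeholders, not the study's data.

```python
import math

# Hypothetical 2x2 table (exposure = poor coronal restoration):
exposed = (90, 233)    # a, b: AP present, AP absent -- illustrative counts
unexposed = (60, 435)  # c, d

a, b = exposed
c, d = unexposed

# Odds ratio with a Woolf (log-based) 95% confidence interval.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```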
Abstract:
Technical evaluation of analytical data is highly relevant because such data are compared against environmental quality standards and used in decision-making on the disposal of dredged sediments and on the evaluation of salt and brackish water quality under CONAMA Resolution 357/05. It is therefore essential that the project manager discuss the environmental agency's technical requirements with the contracted laboratory, both to follow up on analyses underway and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) chain of custody should be provided in order to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis should be undertaken; and (6) chromatograms should be included in the analytical report. In this context, and to help environmental managers evaluate analytical reports, this work discusses the limitations of applying US EPA SW-846 methods to marine samples and the consequences of reporting data based on method detection limits (MDL) rather than sample quantitation limits (SQL), and presents possible modifications of the principal methods applied by laboratories to comply with environmental quality standards.
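Requirement (1), that method quantitation limits fall below the environmental standards, lends itself to a simple automated screen. A minimal sketch follows; the analyte names and limit values are placeholders, not values from CONAMA 357/05 or SW-846.

```python
# Hypothetical screening of requirement (1): each method quantitation
# limit (QL) must fall below the corresponding environmental standard.
standards_ug_L = {"cadmium": 5.0, "lead": 10.0, "benzo(a)pyrene": 0.018}
method_qls_ug_L = {"cadmium": 1.0, "lead": 2.0, "benzo(a)pyrene": 0.05}

for analyte, standard in standards_ug_L.items():
    ql = method_qls_ug_L[analyte]
    status = "OK" if ql < standard else "FAIL: QL above standard"
    print(f"{analyte}: QL={ql} ug/L vs standard={standard} ug/L -> {status}")
```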
Abstract:
We have considered a Bose gas in an anisotropic potential. Applying the Gross-Pitaevskii equation (GPE) for a confined dilute atomic gas, we have used the methods of optimized perturbation theory and self-similar root approximants to obtain an analytical formula for the critical number of particles as a function of the anisotropy parameter of the potential. The spectrum of the GPE is also discussed.
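For reference, the stationary GPE in an anisotropic (cylindrically symmetric) harmonic trap has the standard textbook form below; identifying the anisotropy parameter with the trap frequency ratio is the usual convention and an assumption about this paper's notation.

```latex
% Stationary Gross-Pitaevskii equation for the condensate wave function
% \psi(\mathbf{r}) in an anisotropic harmonic trap (standard form):
\begin{equation}
  \mu\,\psi(\mathbf{r}) =
  \left[ -\frac{\hbar^{2}}{2m}\nabla^{2}
         + \frac{m}{2}\,\omega_{\perp}^{2}\left(x^{2} + y^{2} + \lambda^{2} z^{2}\right)
         + g\,N\,|\psi(\mathbf{r})|^{2} \right] \psi(\mathbf{r}),
  \qquad g = \frac{4\pi\hbar^{2} a_{s}}{m},
\end{equation}
% where \mu is the chemical potential, a_s the s-wave scattering length,
% N the particle number, and \lambda = \omega_z/\omega_\perp the anisotropy
% parameter. For attractive interactions (a_s < 0) stable solutions exist
% only up to a critical particle number N_c(\lambda).
```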
Abstract:
FAPESP n. 03/04061-2
Abstract:
Background Associations between aplastic anemia and numerous drugs, pesticides, and chemicals have been reported. However, at least 50% of the etiology of aplastic anemia remains unexplained. Design and Methods This was a case-control, multicenter, multinational study designed to identify risk factors for agranulocytosis and aplastic anemia. The cases were patients with a diagnosis of aplastic anemia confirmed by biopsy or bone marrow aspiration, selected through an active search of clinical laboratories, hematology clinics, and medical records. The controls had neither aplastic anemia nor chronic diseases. A total of 224 patients with aplastic anemia were included in the study; each case was paired with four controls according to sex, age group, and the hospital where the case was first seen. Information was collected on demographic data, medical history, laboratory tests, medications, and other potential risk factors prior to diagnosis. Results The incidence of aplastic anemia was 1.6 cases per million per year. Higher rates of benzene exposure (>= 30 exposures per year) were associated with a greater risk of aplastic anemia (odds ratio, OR: 4.2; 95% confidence interval, CI: 1.82-9.82). Individuals exposed to chloramphenicol in the previous year had an adjusted OR for aplastic anemia of 8.7 (CI: 0.87-87.93), and those exposed to azithromycin had an adjusted OR of 11.02 (CI: 1.14-108.02). Conclusions The incidence of aplastic anemia in Latin American countries is low. Although the study centers provided high coverage of health services, underreporting of aplastic anemia cases in some regions cannot be ruled out. Frequent exposure to benzene-based products increases the risk of aplastic anemia. Few associations with specific drugs were found, and it is likely that some of these were due to chance alone.
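In a 1:4 matched case-control design like this one, adjusted odds ratios are conventionally estimated with conditional logistic regression, which conditions on the matched sets. A minimal sketch on simulated data follows; the set structure, exposure rates, and single-exposure model are assumptions, not the study's analysis.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(4)

# Hypothetical 1:4 matched sets (224 cases, 896 controls), mimicking the
# design; "benzene" flags exposure >= 30 times per year.
rows = []
for s in range(224):
    for is_case in [1] + [0] * 4:
        exposed = rng.binomial(1, 0.25 if is_case else 0.08)
        rows.append({"set": s, "case": is_case, "benzene": exposed})
df = pd.DataFrame(rows)

# Conditional logistic regression respects the matched sets; the
# exponentiated coefficient is the matched odds ratio for exposure.
model = ConditionalLogit(df["case"], df[["benzene"]], groups=df["set"])
result = model.fit()
print("matched OR:", np.exp(result.params["benzene"]))
```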
Abstract:
Objectives: To evaluate the intratumoral reliability of color Doppler parameters and the contribution of Doppler sonography to the gray-scale differential diagnosis of ovarian masses. Methods: An observational study was performed including 67 patients, 15 (22.4%) with malignant ovarian neoplasms and 52 (77.6%) with benign ovarian diseases. Doppler evaluation was performed in two distinct vessels, selected after decreasing the Doppler gain so as to sample only vessels with higher-velocity flow. Doppler measurements were obtained from each identified vessel, and the resistive index (RI), pulsatility index (PI), peak systolic velocity (PSV), and end-diastolic velocity (EDV) were measured. Intraclass correlation coefficients (ICC), sensitivity, specificity, and the potential improvement in gray-scale ultrasound performance were calculated. Results: The overall ICCs were 0.60 (95% CI 0.42-0.73) for RI, 0.65 (95% CI 0.49-0.77) for PI, 0.07 (95% CI -0.17-0.30) for PSV, and 0.19 (95% CI -0.05-0.41) for EDV. The sensitivity and specificity were, respectively, 84.6% and 86.7% for RI, 69.2% and 93.3% for PI, 80.0% and 65.4% for gray-scale sonography, and 93.3% and 65.4% for gray-scale plus RI (p = 0.013). Conclusions: Gynecologists must be careful when interpreting results from Doppler evaluation of ovarian masses because PSV and EDV show poor intratumoral reliability. The lower RI value, evaluated at two or more distinct sites in the tumor, improved the performance of gray-scale ultrasound in the differential diagnosis of ovarian masses.
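The reliability statistic used here, the intraclass correlation coefficient, can be computed for two measurements per tumor from one-way ANOVA mean squares. A minimal sketch on simulated data; the one-way ICC(1) estimator is one common choice, and the abstract does not specify which ICC form the authors used.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical paired measurements: the same Doppler index (e.g., RI)
# sampled in two distinct vessels of each of 67 masses.
n, k = 67, 2
true_value = rng.normal(0.7, 0.1, n)[:, None]
ratings = true_value + rng.normal(0, 0.05, (n, k))  # within-tumor noise

# One-way random-effects ICC(1) from ANOVA mean squares:
# ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)
grand = ratings.mean()
msb = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
msw = ((ratings - ratings.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc = (msb - msw) / (msb + (k - 1) * msw)
print(f"ICC(1) = {icc:.2f}")
```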
Abstract:
Aim: To identify predictive factors associated with non-deterioration of glucose metabolism following a 2-year behavioral intervention in Japanese-Brazilians. Methods: 295 adults (59.7% women) without diabetes completed the 2-year intervention program. Characteristics of those who maintained or improved glucose tolerance status (non-progressors) were compared with those who worsened (progressors) after the intervention. In logistic regression analysis, the condition of non-progressor was used as the dependent variable. Results: Baseline characteristics of non-progressors (71.7%) and progressors were similar, except that the former were younger and had a higher frequency of disturbed glucose tolerance and lower C-reactive protein (CRP). In logistic regression, non-deterioration of glucose metabolism was associated with disturbed glucose tolerance (impaired fasting glucose or impaired glucose tolerance; p < 0.001) and CRP levels <= 0.04 mg/dL (p = 0.01), adjusted for age and anthropometric variables. Changes in anthropometry and physical activity, and achievement of weight and dietary goals after the intervention, were similar in the subsets that did and did not worsen glucose tolerance status. Conclusion: The whole sample behaved homogeneously during the intervention. Lower CRP levels and a diagnosis of glucose intolerance at baseline were predictors of non-deterioration of glucose metabolism after a relatively simple intervention, independent of body adiposity.