900 results for Hierarchy of text classifiers
Abstract:
This research aimed at developing a research framework for the emerging field of enterprise systems engineering (ESE). The framework consists of an ESE definition, an ESE classification scheme, and an ESE process. This study views an enterprise as a system that creates value for its customers; accordingly, developing the framework made use of systems theory and IDEF methodologies. This study defined ESE as an engineering discipline that develops and applies systems theory and engineering techniques to the specification, analysis, design, and implementation of an enterprise over its life cycle. The proposed ESE classification scheme breaks an enterprise system down into four elements: work, resources, decision, and information. Each enterprise element is specified with four system facets: strategy, competency, capacity, and structure. Each element-facet combination is subject to the engineering process of specification, analysis, design, and implementation, to achieve its pre-specified performance with respect to cost, time, quality, and benefit to the enterprise. This framework is intended for identifying research voids in the ESE discipline. It also helps to apply engineering and systems tools to this emerging field, harnesses the relationships among various enterprise aspects, and bridges the gap between engineering and management practices in an enterprise. The proposed ESE process is generic. It consists of a hierarchy of engineering activities presented in an IDEF0 model, with each activity defined by its inputs, outputs, constraints, and mechanisms. The output of an ESE effort can be a partial or whole enterprise system design for its physical, managerial, and/or informational layers. The proposed ESE process is applicable to a new enterprise system design or to an engineering change in an existing system. The long-term goal of this study is the development of a scientific foundation for ESE research and development.
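As an illustrative aside (not part of the work summarized above), the element-facet classification scheme and the IDEF0 activity description can be sketched as a small Python data structure; all names below are hypothetical and only mirror the terms used in the abstract.

```python
from dataclasses import dataclass, field
from itertools import product

# Enterprise elements, system facets, process steps, and performance measures
# named in the abstract.
ELEMENTS = ("work", "resources", "decision", "information")
FACETS = ("strategy", "competency", "capacity", "structure")
PROCESS = ("specification", "analysis", "design", "implementation")
PERFORMANCE = ("cost", "time", "quality", "benefit")

@dataclass
class IDEF0Activity:
    """One engineering activity described by its IDEF0 sets."""
    name: str
    inputs: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    controls: list[str] = field(default_factory=list)    # constraints
    mechanisms: list[str] = field(default_factory=list)  # tools, people, methods

# Every element-facet combination is subject to the four-step engineering process.
combinations = list(product(ELEMENTS, FACETS))
print(f"{len(combinations)} element-facet pairs x {len(PROCESS)} process steps")
```

Such a representation makes it straightforward to enumerate the 16 element-facet pairs when looking for research voids, which is the framework's stated purpose.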
Abstract:
Memory (cache, DRAM, and disk) is in charge of providing data and instructions to a computer's processor. In order to maximize performance, the speeds of the memory and the processor should be equal; however, using memory that always matches the speed of the processor is prohibitively expensive. Computer hardware designers have managed to drastically lower the cost of the system with the use of memory caches by sacrificing some performance. A cache is a small piece of fast memory that stores popular data so it can be accessed faster. Modern computers have evolved into a hierarchy of caches, where a memory level is the cache for a larger and slower memory level immediately below it. Thus, by using caches, manufacturers are able to store terabytes of data at the cost of the cheapest memory while achieving speeds close to that of the fastest. The most important decision about managing a cache is what data to store in it; failing to make good decisions can lead to performance overheads and over-provisioning. Surprisingly, caches choose data to store based on policies that have not changed in principle for decades. However, computing paradigms have changed radically, leading to two noticeably different trends. First, caches are now consolidated across hundreds to even thousands of processes. And second, caching is being employed at new levels of the storage hierarchy due to the availability of high-performance flash-based persistent media. This brings four problems. First, as the workloads sharing a cache increase, it is more likely that they contain duplicated data. Second, consolidation creates contention for caches, and if not managed carefully, it translates to wasted space and sub-optimal performance. Third, as contended caches are shared by more workloads, administrators need to carefully estimate specific per-workload requirements across the entire memory hierarchy in order to meet per-workload performance goals. And finally, current cache write policies are unable to simultaneously provide performance and consistency guarantees for the new levels of the storage hierarchy. We addressed these problems by modeling their impact and by proposing solutions for each of them. First, we measured and modeled the amount of duplication at the buffer cache level and contention in real production systems. Second, we created a unified model of workload cache usage under contention to be used by administrators for provisioning, or by process schedulers to decide which processes to run together. Third, we proposed methods for removing cache duplication and for eliminating space wasted because of contention. And finally, we proposed a technique to improve the consistency guarantees of write-back caches while preserving their performance benefits.
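The abstract's point that cache replacement policies "have not changed in principle for decades" refers to classics such as least-recently-used (LRU). The sketch below is not taken from the dissertation; it is a minimal, self-contained LRU cache included only to make the "what data to store" decision concrete.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: evicts the block untouched the longest."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.blocks = OrderedDict()          # key -> data, ordered by recency

    def get(self, key):
        if key not in self.blocks:
            return None                      # cache miss: caller fetches from slower level
        self.blocks.move_to_end(key)         # mark as most recently used
        return self.blocks[key]

    def put(self, key, data):
        if key in self.blocks:
            self.blocks.move_to_end(key)
        self.blocks[key] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)  # evict the least recently used block

cache = LRUCache(capacity=2)
cache.put("a", 1); cache.put("b", 2)
cache.get("a")                               # "a" becomes most recent
cache.put("c", 3)                            # evicts "b"
```

Every get refreshes a block's recency and every put beyond capacity evicts the block that has gone unused the longest, which is exactly the kind of heuristic a consolidated, contended cache can struggle with.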
Abstract:
The National System for the Integral Development of the Family (DIF) in Mexico assists children in orphanages. This paper provides an overview of its current practices, and advocates a holistic educational/social model for “alternative orphanages,” integrating Maslow’s Hierarchy of Needs and the rights-based approach. The model complements DIF’s social efforts.
Abstract:
Securing an adequate supply of special education teachers is a critical issue, and understanding the causes of attrition is vital to addressing the problem. This review analyzes the literature and identifies factors in teacher retention/attrition, overlaying the concept of Maslow's hierarchy of needs to understand this phenomenon from a psychological perspective.
Abstract:
The study explores the framing of the Chicago teachers strike in the media. Using content analysis of text from major media sources, it identifies major themes in the frames, theorizes their purpose, and explores the reactions of teachers, as public intellectuals, to those frames.
Abstract:
Reading deficits in students in Grades 4 to 12 are evident in American schools, and informational text is particularly difficult for students. This quasi-experimental study (N=138) investigated sixth-grade students' achievement in social studies using the Reciprocal Mapping instructional routine, compared with the achievement of sixth-grade students taught with a traditional approach. The Reciprocal Mapping instructional routine incorporated explicit instruction in text structure using graphic organizers; students created their own graphic organizers and used them to write about social studies content. The comparison group used a traditional approach in which students read the textbook and answered questions. Participants were sixth-graders in the seven sixth-grade classrooms in two public schools in a small, rural south Florida school district. A focus of this study was to determine the helpfulness of the intervention for at-risk readers. To identify students considered at risk, the researcher used data from the reading portion of the Florida Comprehensive Assessment Test (FCAT), 2011-2012, which classifies Levels 1 and 2 as at-risk readers. The quasi-experimental study used a pretest-posttest control group design, with students assigned to treatment groups by class. Two teachers at the two rural sites were trained on the Reciprocal Mapping instructional routine and taught students in both the experimental and control groups for an equivalent amount of time over a 5-week period. Results of the 3 x 2 factorial ANCOVA found a significant positive difference favoring the experimental group's social studies achievement compared to that of the comparison group, as measured by the pre/post unit test from the social studies series (McGraw-Hill, 2013), when controlling for initial differences in students' reading FCAT scores. Interactions for high-risk struggling readers were investigated at the significance level p < .05. Because no significant interaction was found, the main effects of treatment were interpreted. The pretest was used as a covariate, and the multivariate analysis was found to be significant; therefore, analysis of covariance was run on each of the dependent variables as a follow-up. Reciprocal Mapping was found to have a significant effect on posttest scores, independent of gender and level of risk, while holding pretest scores constant. Findings showed that high-risk readers taught with the Reciprocal Mapping intervention scored significantly better than students in the control group. Further findings showed that teacher fidelity of implementation of the treatment was a statistically significant predictor of posttest scores when controlling for pretest scores. Study results indicated that improving students' use of text structure through the Reciprocal Mapping instructional routine positively supported sixth-grade students' social studies achievement.
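For readers unfamiliar with the analysis described above (treatment group and risk level as factors, the pretest as a covariate), a minimal ANCOVA sketch in Python/statsmodels follows; the data and column names are made up for illustration and do not reproduce the study's 3 x 2 factorial design or its results.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: posttest modeled with treatment group and risk level,
# controlling for the pretest score (the covariate), as in an ANCOVA.
df = pd.DataFrame({
    "posttest": [72, 68, 80, 75, 65, 70, 85, 78],
    "pretest":  [60, 58, 70, 66, 55, 60, 72, 68],
    "group":    ["treat", "control"] * 4,
    "risk":     ["high", "high", "low", "low"] * 2,
})

model = smf.ols("posttest ~ C(group) * C(risk) + pretest", data=df).fit()
print(model.summary())   # interaction term is checked first; if non-significant,
                         # the main effects of treatment are interpreted
```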
Abstract:
This article argues for a political transformation and reorganization of the university so that it is capable of challenging the "hierarchy of power in a neoliberal society." Faculty democracy, administrative accountability to faculty, and the education of students to become critical, thinking citizens would be a major part of this reorganization. This article first appeared in The Contemporary Condition: http://contemporarycondition.blogspot.com/2014/07/toward-eco-egalitarian-university.html
Abstract:
The focus of this thesis is text data compression based on the fundamental coding scheme referred to as the American Standard Code for Information Interchange, or ASCII. The research objective is the development of software algorithms that result in significant compression of text data. Past and current compression techniques have been thoroughly reviewed to ensure proper contrast between the compression results of the proposed technique and those of existing ones. The research problem is based on the need to achieve higher compression of text files in order to save valuable memory space and increase the transmission rate of these files. It was deemed necessary that the compression algorithm to be developed be effective even for small files and be able to handle uncommon words, as these are dynamically included in the dictionary once they are encountered. A critical design aspect of this compression technique is its compatibility with existing compression techniques; in other words, the developed algorithm can be used in conjunction with existing techniques to yield even higher compression ratios. This thesis demonstrates such capabilities and outcomes, and the research objective of achieving a higher compression ratio is attained.
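The abstract does not detail the algorithm, but the idea of dynamically adding newly encountered words or patterns to a dictionary is the basis of LZW-style compression. The sketch below is a minimal, illustrative LZW encoder over 8-bit ASCII text, offered as a point of reference rather than as the thesis's actual method.

```python
def lzw_compress(text: str) -> list[int]:
    """Minimal LZW: starts from the 256 single-byte codes and dynamically
    adds new substrings to the dictionary as they are encountered."""
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    current = ""
    output = []
    for ch in text:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate                 # keep extending the match
        else:
            output.append(dictionary[current])  # emit code for longest known prefix
            dictionary[candidate] = next_code   # learn the new substring
            next_code += 1
            current = ch
    if current:
        output.append(dictionary[current])
    return output

sample = "the theme of the thesis"
codes = lzw_compress(sample)
print(len(codes), "codes emitted for", len(sample), "characters")
```

Because the decoder rebuilds the same dictionary as it decodes, no dictionary needs to be transmitted, which is one reason such schemes compose well with other compression techniques.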
Abstract:
Background: Evidence-based medication and lifestyle modification are important for secondary prevention of cardiovascular disease but are underutilized. Mobile health strategies could address this gap, but existing evidence is mixed. Therefore, we piloted a pre-post study to assess the impact of patient-directed text messages as a means of improving medication adherence and modifying major health risk behaviors among coronary heart disease (CHD) patients in Hainan, China.
Methods: 92 CVD patients were surveyed between June and August 2015 (before the intervention) and then between October and December 2015 (after the 12-week intervention) about (a) medication use, (b) smoking status, (c) fruit and vegetable consumption, and (d) physical activity uptake. Acceptability of the text-messaging intervention was assessed at follow-up. Descriptive statistics, along with paired comparisons between the pre and post outcomes, were computed using both parametric (t-test) and non-parametric (Wilcoxon signed rank test) methods.
Results: The number of respondents at follow-up was 82 (89% retention rate). Significant improvements were observed for medication adherence (P<0.001) and for the number of cigarettes smoked per day (P=0.022). However, there was no change in the number of smokers who had quit smoking at follow-up. Changes in physical activity (P=0.91) and fruit and vegetable consumption were not significant.
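As a brief illustration of the paired pre/post comparisons mentioned in the Methods, the SciPy sketch below runs both a paired t-test and a Wilcoxon signed-rank test; the adherence scores are invented and do not reflect the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post medication-adherence scores for the same ten patients.
pre  = np.array([4, 5, 3, 6, 5, 4, 6, 5, 4, 3])
post = np.array([6, 7, 5, 7, 6, 6, 7, 6, 5, 5])

t_stat, p_t = stats.ttest_rel(pre, post)   # paired t-test (parametric)
w_stat, p_w = stats.wilcoxon(pre, post)    # Wilcoxon signed-rank (non-parametric)
print(f"paired t-test: p={p_t:.4f}; Wilcoxon: p={p_w:.4f}")
```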
Abstract:
Sexual harassment at work is a form of gender violence that is rarely made visible yet remains present in labor organizations, where it continues to generate high levels of suffering, discrimination, and inequality, mainly affecting women. Addressing it properly requires an organizational change toward equity, grounded in knowledge of the subjective meanings that stakeholders (staff, union representatives, employers, public administration, etc.) attribute to this reality. In this article we present the main findings of a qualitative study on the social perception of sexual harassment. The work highlights the existence of many strategies aimed at legitimizing and minimizing the relevance of the problem, blaming the victim, and justifying the lack of support from the environment and/or of the organization's involvement in solutions. Among the conclusions we underline the need for new models of business management that involve all stakeholders in the prevention and control of this problem in a responsible way.
Abstract:
An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open-ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon and almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6500 profiles for each radiometric channel have been acquired. Because these radiometric data are collected without operator control and regardless of meteorological conditions, specific and automatic data processing protocols have to be developed. Here, we present a data quality-control procedure aimed at verifying profile shapes and providing near real-time data distribution. This procedure is specifically developed to: 1) identify the main measurement issues (i.e. dark signal, atmospheric clouds, spikes and wave-focusing occurrences); and 2) validate the final data with a hierarchy of tests to ensure their scientific usability. The procedure, adapted to each of the four radiometric channels, is designed to flag each profile in a way compliant with the data management procedure used by the Argo program. The main perturbations in the light field are identified by the new protocols with good performance over the whole dataset, highlighting the procedure's potential applicability at the global scale. Finally, the comparison with modeled surface irradiances allows assessing the accuracy of quality-controlled measured irradiance values and identifying any possible evolution over the float lifetime due to biofouling and instrumental drift.
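The paper's exact tests are not reproduced here, but as an illustration of the kind of spike check such a quality-control hierarchy can include, the sketch below flags points in an irradiance profile that deviate strongly from a running median; the function name, window, and threshold are hypothetical.

```python
import numpy as np

def flag_spikes(irradiance: np.ndarray, window: int = 5, threshold: float = 3.0) -> np.ndarray:
    """Flag points deviating from a running median by more than `threshold`
    robust standard deviations (illustrative spike test only)."""
    flags = np.zeros(irradiance.size, dtype=bool)
    half = window // 2
    for i in range(irradiance.size):
        lo, hi = max(0, i - half), min(irradiance.size, i + half + 1)
        neighbors = np.delete(irradiance[lo:hi], i - lo)   # exclude the point itself
        median = np.median(neighbors)
        mad = np.median(np.abs(neighbors - median)) or 1e-12
        if abs(irradiance[i] - median) > threshold * 1.4826 * mad:
            flags[i] = True
    return flags

profile = np.array([1.00, 0.95, 0.90, 3.50, 0.85, 0.80, 0.78])  # hypothetical Ed(490) profile
print(flag_spikes(profile))   # only the 3.50 value is flagged as a spike
```

A real implementation would also need the dark-signal, cloud, and wave-focusing checks listed above, plus the Argo-compliant flagging scheme.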
Abstract:
Marine heatwaves (MHWs) have been observed around the world and are expected to increase in intensity and frequency under anthropogenic climate change. A variety of impacts have been associated with these anomalous events, including shifts in species ranges, local extinctions and economic impacts on seafood industries through declines in important fishery species and impacts on aquaculture. Extreme temperatures are increasingly seen as important influences on biological systems, yet a consistent definition of MHWs does not exist. A clear definition will facilitate retrospective comparisons between MHWs, enabling the synthesis and a mechanistic understanding of the role of MHWs in marine ecosystems. Building on research into atmospheric heatwaves, we propose both a general and a specific definition for MHWs, based on a hierarchy of metrics that allow for different data sets to be used in identifying MHWs. We generally define a MHW as a prolonged, discrete, anomalously warm water event that can be described by its duration, intensity, rate of evolution, and spatial extent. Specifically, we consider an anomalously warm event to be a MHW if it lasts for five or more days, with temperatures warmer than the 90th percentile based on a 30-year historical baseline period. This structure provides flexibility with regard to the description of MHWs and transparency in communicating MHWs to a general audience. The use of these metrics is illustrated for three 21st-century MHWs: the northern Mediterranean event in 2003, the Western Australia ‘Ningaloo Niño’ in 2011, and the northwest Atlantic event in 2012. We recommend a specific quantitative definition for MHWs to facilitate global comparisons and to advance our understanding of these phenomena.
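The specific definition above (temperatures above the climatological 90th percentile for five or more consecutive days) translates directly into code. The sketch below is a simplified illustration that uses a fixed threshold rather than the full day-of-year climatology computed from a 30-year baseline.

```python
import numpy as np

def detect_mhw(sst: np.ndarray, pctl90: np.ndarray, min_days: int = 5):
    """Return (start, end) index pairs of events where SST exceeds the
    90th-percentile threshold for at least `min_days` consecutive days."""
    above = sst > pctl90
    events, start = [], None
    for day, hot in enumerate(above):
        if hot and start is None:
            start = day
        elif not hot and start is not None:
            if day - start >= min_days:
                events.append((start, day - 1))
            start = None
    if start is not None and len(above) - start >= min_days:
        events.append((start, len(above) - 1))
    return events

# Hypothetical 15-day SST series with a flat 90th-percentile threshold of 20.0 degC.
sst = np.array([19.5, 19.8, 20.3, 20.6, 20.9, 21.1, 20.4, 19.9, 19.7,
                20.2, 20.1, 19.6, 19.4, 19.3, 19.2])
print(detect_mhw(sst, np.full(sst.size, 20.0)))   # one 5-day event at indices 2-6
```

Each detected event could then be summarized by the metrics the definition names: duration, intensity, rate of evolution, and spatial extent.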