15 results for Equipment Failure Analysis
in Digital Commons at Florida International University
Abstract:
This thesis develops and validates the framework of a specialized maintenance decision support system for a discrete part manufacturing facility. Its construction follows a modular approach based on the fundamental philosophy of Reliability Centered Maintenance (RCM). The proposed architecture integrates System Decomposition, System Evaluation, Failure Analysis, Logic Tree Analysis, and Maintenance Planning modules, and is intended to address the maintenance inadequacies peculiar to modern discrete part manufacturing systems. Well-established techniques are incorporated as building blocks of the system's modules, including Failure Mode, Effects and Criticality Analysis (FMECA), Logic Tree Analysis (LTA), the Theory of Constraints (TOC), and an Expert System (ES). A Maintenance Information System (MIS) performs the system's support functions. Validation was performed by field testing of the system at a Miami-based manufacturing facility. Such a maintenance support system can reduce downtime losses and contribute to higher product quality output, ultimately improving profitability.
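As a rough illustration of how the modules named above could fit together, the Python sketch below composes an FMECA-style criticality score with a logic-tree routing step to produce a maintenance plan. The module interfaces, data fields, and scoring thresholds are hypothetical illustrations, not the thesis's actual design.

# Hypothetical sketch of a modular RCM-style decision support pipeline.
# Module names follow the abstract; interfaces, fields, and thresholds
# are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    description: str
    severity: int      # 1 (minor)  .. 10 (catastrophic)
    occurrence: int    # 1 (rare)   .. 10 (frequent)
    detection: int     # 1 (easy)   .. 10 (hard to detect)

def fmeca_criticality(fm: FailureMode) -> int:
    """Risk priority number, the usual FMECA criticality score."""
    return fm.severity * fm.occurrence * fm.detection

def logic_tree_analysis(fm: FailureMode) -> str:
    """Very coarse LTA-style routing of a failure mode to a maintenance policy."""
    rpn = fmeca_criticality(fm)
    if rpn >= 200:
        return "redesign / condition-based monitoring"
    if rpn >= 80:
        return "scheduled preventive maintenance"
    return "run to failure"

def maintenance_plan(failure_modes: list) -> dict:
    """Maintenance Planning module: map each failure mode to a recommended task."""
    return {fm.description: logic_tree_analysis(fm) for fm in failure_modes}

if __name__ == "__main__":
    modes = [
        FailureMode("spindle bearing", "bearing seizure", 9, 4, 6),
        FailureMode("coolant pump", "seal leak", 4, 6, 3),
    ]
    for task, policy in maintenance_plan(modes).items():
        print(f"{task}: {policy}")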
Abstract:
Being at-risk is a growing problem in the U.S. because of disturbing societal trends such as unemployment, divorce, substance abuse, child abuse and neglect, and the new threat of terrorist violence. Resilience characterizes individuals who rebound from or adapt to adversities such as these, and academic resilience distinguishes at-risk students who succeed in school despite hardships.

The purpose of this research was to perform a meta-analysis to examine the power of resilience and to suggest ways educators might improve academic resilience, operationalized as satisfactory test scores and grades. To find all studies relevant to academic resilience in at-risk kindergarten through 12th-grade students, extensive electronic and hard-copy searches were conducted, resulting in a database of 421 articles. Two hundred eighty-seven of these were rejected quickly because they were not empirical research. Upon further examination, another 106 were rejected for not meeting the study protocol criteria. Ultimately, 28 studies were coded for study-level descriptors and effect size variables.

Protective factors for resilience were found to originate in physical, psychological, and behavioral domains at the proximal/intraindividual, transitional/intrafamilial, or distal/extrafamilial level. Effect sizes (ESs) were weighted, and the means for each level or category were interpreted against commonly accepted benchmarks. Mean effect sizes for the proximal (M = .27) and transitional (M = .15) levels were small but significant. The mean effect size for the distal level was not significant. This supported the hypotheses that the proximal level was the source of most protective factors for academic resilience in at-risk students, followed by the transitional level. The distal effect size warranted further research, particularly in light of the small number of studies (n = 11) contributing effect sizes to that category. A homogeneity test indicated that a search for moderators, i.e., study variables affecting outcomes, was justified. “Category” was the largest moderator. Graphs of weighted mean effect sizes in the physical, psychological, and behavioral domains were plotted for each level to better illustrate the findings of the meta-analysis. Suggestions were made for combining resilience development with aspects of positive psychology to promote resilience in the schools.
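The abstract does not state the weighting scheme used; as a hedged illustration of the arithmetic behind weighted mean effect sizes and the homogeneity test, the Python sketch below applies the common inverse-variance weighting and Cochran's Q to invented study data.

# Illustrative inverse-variance weighted mean effect size and Cochran's Q
# homogeneity statistic; the effect sizes and variances are made up, and
# the weighting scheme is an assumption (the abstract does not state it).
import numpy as np
from scipy import stats

d = np.array([0.31, 0.22, 0.40, 0.15])   # hypothetical study effect sizes
v = np.array([0.02, 0.05, 0.03, 0.04])   # hypothetical sampling variances

w = 1.0 / v                               # inverse-variance weights
d_bar = np.sum(w * d) / np.sum(w)         # weighted mean effect size
se = np.sqrt(1.0 / np.sum(w))             # standard error of the mean ES
Q = np.sum(w * (d - d_bar) ** 2)          # Cochran's Q (homogeneity)
p_Q = stats.chi2.sf(Q, df=len(d) - 1)     # small p suggests looking for moderators

print(f"weighted mean ES = {d_bar:.3f} (SE {se:.3f}), Q = {Q:.2f}, p = {p_Q:.3f}")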
Abstract:
To promote regional or mutual improvement, numerous interjurisdictional efforts to share tax bases have been attempted. Most of these efforts fail to be consummated. Motivations to share revenues include narrowing fiscal disparities, enhancing regional cooperation and economic development, rationalizing land use, and minimizing revenue losses caused by competition to attract and keep businesses. Various researchers have developed theories to aid understanding of why interjurisdictional cooperation efforts succeed or fail. Walter Rosenbaum and Gladys Kammerer studied two contemporaneous Florida local-government consolidation attempts. Boyd Messinger subsequently tested their Theory of Successful Consolidation on nine consolidation attempts. Paul Peterson's dual theories on Modern Federalism posit that all governmental levels attempt to further economic development and that politicians act in ways that either further their futures or cement job security; actions related to the latter theory often interfere with the former. Samuel Nunn and Mark Rosentraub sought to learn how interjurisdictional cooperation evolves, and through multiple case studies they developed a model framing interjurisdictional cooperation in four dimensions.

This dissertation investigates the ability of the above theories to help predict the success or failure of regional tax-base revenue sharing attempts. A research plan was formed that used five sequenced steps to gather data, analyze them, and conclude whether hypotheses concerning the application of these theories were valid. The primary analytical tools were multiple case studies, cross-case analysis, and pattern matching. Data were gathered from historical records, questionnaires, and interviews.

The results of this research indicate that the Rosenbaum-Kammerer theory can be a predictor of success or failure in implementing tax-base revenue sharing if it is amended as suggested by Messinger and further modified by a recommendation in this dissertation. Peterson's Functional and Legislative theories, considered together, were able to predict revenue sharing proposal outcomes. Many of the indicators of interjurisdictional cooperation put forward in the Nunn-Rosentraub model appeared in the cases studied, but the model was not a reliable forecasting instrument.
Abstract:
Private nonprofit human service organizations provide a spectrum of services that aim to resolve societal problems. Their failure may leave needed and desired services unprovided, or provided insufficiently to meet public demand. However, the concept of organizational failure has not been examined for the nonprofit organization. This research addresses that deficiency in the literatures on organizational failure and nonprofit organizations.

An eight-category typology, developed from a review of the current literature and findings from expert interviews, is initially presented to define nonprofit organization failure. A multiple case study design is used to test the typology in four nonprofit human service delivery agencies. The case analysis reduces the typology to five types salient to nonprofit organization failure: input failure, legitimacy failure, adaptive failure, management failure, and leadership failure.

The resulting five-category typology is useful to both theory builders and nonprofit practitioners. For theory development, the interaction of the failure types extends the literature and lays a foundation for a theory of nonprofit organization failure that diffuses management and leadership across all of the failure types, highlights management and leadership failure as collective functions shared by paid staff and the volunteer board of directors, and emphasizes the importance of organizational legitimacy.

From a practical perspective, the typology provides a tool for diagnosing failure in the nonprofit organization. Using the management indicators developed for the typology, a checklist of the warning signals of potential failure, emphasizing the key types of management and leadership failure, offers nonprofit decision makers an a priori examination of an organization's propensity for failure.
Abstract:
The purpose of this study is to produce a model to be used by state regulating agencies to assess demand for subacute care. In accomplishing this goal, the study refines the definition of subacute care, demonstrates a method for bed-need assessment, and measures the effectiveness of this new level of care. This was the largest study of subacute care to date. Research focused on 19 subacute units in 16 states, each of which provides high-intensity rehabilitative and/or restorative care carried out in a high-tech unit. Each of the facilities was based in a nursing home but utilized separate staff, equipment, and services. Because these facilities are under local control, it was possible to study regional differences in subacute care demand.

Using these data, a model for predicting demand for subacute care services was created, building on earlier models submitted by John Whitman for the American Hospital Association and by Robin E. MacStravic. The Broderick model uses the "bootstrapping" method and takes advantage of widely available information technology: computers and software, databases in business and government, publicly available databases from providers or commercial vendors, professional organizations, and other information sources. Using these newly available sources of information, the model addresses the problems and needs of health care planners as they approach the challenges of the 21st century.
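The abstract names a "bootstrapping" method without further detail, so the following Python sketch is only a generic illustration of how bootstrap resampling of observed utilization data could yield an interval estimate for bed need; the data, the bed-need formula, and the resampling design are invented and are not the Broderick model itself.

# Generic bootstrap sketch for bed-need estimation; data, formula, and
# resampling design are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily subacute census observed at comparable units (one year)
daily_census = rng.poisson(lam=14, size=365)

def beds_needed(census, target_occupancy=0.85):
    """Average daily census converted to beds at a target occupancy rate."""
    return census.mean() / target_occupancy

boot = np.array([
    beds_needed(rng.choice(daily_census, size=daily_census.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"point estimate: {beds_needed(daily_census):.1f} beds, 95% CI [{lo:.1f}, {hi:.1f}]")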
Abstract:
Expositions of student work at the end of the extended school year are one of many reform efforts in a specially formed School Improvement Zone in Miami-Dade schools. This descriptive analysis offers examples of successful attempts to engender pride even in the face of formidable social and cultural obstacles.
Abstract:
In their dialogue entitled "The Food Service Industry Environment: Market Volatility Analysis," Alex F. De Noble, Assistant Professor of Management, San Diego State University, and Michael D. Olsen, Associate Professor and Director, Division of Hotel, Restaurant & Institutional Management at Virginia Polytechnic Institute and State University, preface the discussion by saying: "Hospitality executives, as a whole, do not believe they exist in a volatile environment and spend little time or effort in assessing how current and future activity in the environment will affect their success or failure. The authors highlight potential differences that may exist between executives' perceptions and objective indicators of environmental volatility within the hospitality industry and suggest that executives change these perceptions by incorporating the assumption of a much more dynamic environment into their future strategic planning efforts. Objective, empirical evidence of the dynamic nature of the hospitality environment is presented and compared to several studies pertaining to environmental perceptions of the industry." That weighty thesis statement presumes that hospitality executives and managers do not fully comprehend the environment in which they operate. The authors provide a contrast, one that conventional wisdom would seem to support. "Broadly speaking, the operating environment of an organization is represented by its task domain," say the authors. "This task domain consists of such elements as a firm's customers, suppliers, competitors, and regulatory groups." These are dynamic actors and the underpinnings of change, the authors note by way of citation. "The most difficult aspect for management in this regard tends to be the development of a proper definition of the environment of their particular firm. Being able to precisely define who the customers, competitors, suppliers, and regulatory groups are within the environment of the firm is no easy task, yet is imperative if proper planning is to occur," De Noble and Olsen add in support of their thesis statement. The article is heavy with tables, both survey-based and empirically driven, to illustrate market volatility, and that's not necessarily a bad thing. One such table is the Bates and Eldredge outline, Table 6 in the article. "This comprehensive outline…should prove to be useful to most executives in expanding their perception of the environment of their firm," say De Noble and Olsen. "It is, however, only a suggested outline," they advise. "…risk should be incorporated into every investment decision, especially in a volatile environment," say the authors. De Noble and Olsen close with an intriguing formula to gauge volatility in an environment.
Abstract:
The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately.

Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One such method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment.

Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were over-stressed before encapsulation; a second, unaged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant.

The graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A; application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years.

This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
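The study relied on Shapiro-Meeker graphical analysis and Cox's general log-linear model; as a simplified, hedged illustration of how an 80%-reliable life is read off a fitted time-to-failure model, the Python sketch below fits a Weibull distribution to invented failure times and solves S(t) = 0.80 for t.

# Illustration of extracting an 80%-reliable life from time-to-failure data;
# the Weibull fit is a simplification of the methods named in the abstract,
# and the failure times below are invented.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
t_fail = weibull_min.rvs(c=2.5, scale=25.0, size=40, random_state=rng)  # years

shape, loc, scale = weibull_min.fit(t_fail, floc=0)  # fix location at zero
# 80% reliability: S(t) = exp(-(t/scale)**shape) = 0.80
t_80 = scale * (-np.log(0.80)) ** (1.0 / shape)
print(f"shape={shape:.2f}, scale={scale:.1f} yr, 80%-reliable life = {t_80:.1f} yr")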
Abstract:
This qualitative study used grounded theory methods and purposeful sampling to explore perceptions of caring and being cared-for. Twenty-four adolescent male participants, identified as at-risk for school failure, completed a two-phase interview process exploring these phenomena within three relationships: with a friend, with the most caring person they knew, and with the teacher they felt cared for them.

Each participant was asked a predetermined set of open questions in an initial semi-structured interview, and each was encouraged to explore his own reflections on caring. A second interview allowed for member checking and for the participant to continue sharing his meaning of caring and being cared-for.

Line-by-line analysis with open, axial, and selective coding was applied to interview transcripts, along with a constant comparative method. Results indicated that the core category integrating all other categories was attachment bonding. Participants' stories manifested characteristics of proximity seeking, secure base, safe haven, and distress upon involuntary separation from an attachment figure.

Strategies facilitating attachment bonding were influenced by the power positions of the relational players. Participants responded positively to the one-caring when they felt cared-for. Results further indicated that participants did not need to feel a sense of belonging in order to feel cared-for. Teacher behaviors indicating openness to authentic connections with students were specific to the teacher's friendliness and professional competence. Teachers who nurtured feelings of being cared-for were uncommon in the participants' educational experience.

The number of adolescent males leaving high school prematurely is both a personal problem and a social problem. Despite a "mask" of indifference often exhibited by adolescent males at-risk for school failure, teachers might consider the social/emotional needs of these students when implementing the curriculum. Policy makers, too, might consider the social/emotional needs of this vulnerable population when developing programs meant to foster psychological well-being and connectedness for adolescent males at-risk for school failure.
Abstract:
NOTCH1 is a member of the NOTCH receptor family, a group of single-pass transmembrane receptors. NOTCH signaling is highly conserved in evolution and mediates communication between adjacent cells. NOTCH receptors have been implicated in cell fate determination, as well as in the maintenance and differentiation of stem cells. In the mammalian testis, expression of NOTCH1 in somatic and germ cells has been demonstrated; however, its role in spermatogenesis was not clear. To study the significance of NOTCH1 in germ cells, we applied a Cre/loxP approach in mice to induce NOTCH1 gain- or loss-of-function specifically in male germ cells. Using a Stra8-icre transgene, we produced mice with conditional activation of the NOTCH1 intracellular domain (NICD) in germ cells. Spermatogenesis in these mutants was progressively affected with age, resulting in decreased testis weight and sperm count. Analysis of downstream target genes of NOTCH1 signaling showed increased expression of Hes5, with reduced expression of the spermatogonial differentiation marker Neurog3 in the mutant testis. Apoptosis was significantly increased in mouse germ cells, with a corresponding elevation in expression of the pro-apoptotic Trp53 and Trp63 genes. We also showed that conditional germ cell-specific ablation of Notch1 had no effect on spermatogenesis or male fertility. Our data suggest the importance of regulated NOTCH signaling in male germ cells for their survival and differentiation.
Abstract:
Background: The etiology of most premature ovarian failure (POF) cases is usually elusive. Although genetic causes clearly exist and a likely susceptibility region at 8q22.3 has been discovered, no predominant explanation exists for POF. More recently, evidence has indicated that mutations in the NR5A1 gene could be causative for POF. We therefore screened for mutations in the NR5A1 gene in a large cohort of Chinese women with non-syndromic POF.
Methods: Mutation screening of the NR5A1 gene was performed in 400 Han Chinese women with well-defined 46,XX idiopathic non-syndromic POF and in 400 controls. Subsequently, the novel mutation identified was functionally characterized in vitro.
Results: A novel heterozygous missense mutation [c.13T>G (p.Tyr5Asp)] in NR5A1 was identified in 1 of 384 patients (0.26%). This mutation impaired transcriptional activation of the Amh, Inhibin-a, Cyp11a1, and Cyp19a1 genes, as shown by transactivation assays. However, no dominant negative effect was observed, nor was there any impact on protein expression or nuclear localization.
Conclusions: This novel mutation, p.Tyr5Asp, located in a novel non-domain region, is presumed to result in haploinsufficiency. Regardless, perturbation of NR5A1 is not a common explanation for POF in Chinese women.
Abstract:
The electronics industry is experiencing two trends, one of which is the drive toward miniaturization of electronic products. In-circuit testing, predominantly used for continuity testing of printed circuit boards (PCBs), can no longer meet the demands of smaller circuits, which has led to the development of moving-probe test equipment. Moving-probe testing opens up the opportunity to test PCBs whose test points are on a small pitch (distance between points). However, because the test uses probes that move sequentially, the total test time is much greater than for traditional in-circuit testing. While significant effort has concentrated on equipment design and development, little work has examined algorithms for efficient test sequencing. The test sequence has the greatest impact on total test time, which in turn determines the production cycle time of the product. Minimizing total test time is an NP-hard problem similar to the traveling salesman problem, except with two traveling salesmen that must coordinate their movements. The main goal of this thesis was to develop a heuristic algorithm to minimize flying probe test time and to evaluate it against a "Nearest Neighbor" algorithm. The algorithm was implemented with Visual Basic and an MS Access database, and it was evaluated with actual PCB test data taken from industry. A statistical analysis at the 95% confidence level was performed to test the hypothesis that the proposed algorithm finds a sequence whose total test time is less than that found by the "Nearest Neighbor" approach. Findings demonstrated that the proposed heuristic algorithm reduces total test time and that production cycle time can therefore be reduced through proper sequencing.
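As a point of reference, here is a minimal Python sketch of the "Nearest Neighbor" baseline for ordering test points, simplified to a single probe; the two-probe coordination the thesis addresses is not modeled, and the coordinates and Euclidean travel cost are illustrative assumptions.

# Nearest-neighbor baseline for ordering probe test points, simplified to a
# single probe; coordinates and cost model are illustrative assumptions.
import math

def nearest_neighbor_sequence(points):
    """Greedy tour: always move the probe to the closest untested point."""
    remaining = list(points[1:])
    tour = [points[0]]
    while remaining:
        last = tour[-1]
        nxt = min(remaining, key=lambda p: math.dist(last, p))
        remaining.remove(nxt)
        tour.append(nxt)
    return tour

def tour_length(tour):
    return sum(math.dist(a, b) for a, b in zip(tour, tour[1:]))

if __name__ == "__main__":
    test_points = [(0, 0), (5, 1), (1, 4), (6, 5), (2, 2)]  # mm, invented
    seq = nearest_neighbor_sequence(test_points)
    print(seq, f"travel = {tour_length(seq):.2f} mm")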
Abstract:
Unequal improvements in processor and I/O speeds are making many applications, such as databases and operating systems, increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors were proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.
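A toy simulation can make the two drawbacks concrete: in normal mode reads can be split across the mirrored pair while every write hits both disks, and in failure mode the surviving disk absorbs all requests. The Python sketch below uses invented request counts and a round-robin read policy purely for illustration.

# Toy illustration of I/O load on a mirrored pair: reads split across both
# disks in normal mode, all reads go to the survivor in failure mode, and
# every write is applied to each live mirror. Counts and routing are invented.
from collections import Counter

def mirrored_load(n_reads, n_writes, failed_disk=None):
    load = Counter()
    for i in range(n_reads):
        if failed_disk is None:
            load[f"disk{i % 2}"] += 1            # round-robin the reads
        else:
            load[f"disk{1 - failed_disk}"] += 1  # survivor services all reads
    for _ in range(n_writes):
        for d in (0, 1):
            if d != failed_disk:
                load[f"disk{d}"] += 1            # writes update every live mirror
    return dict(load)

print("normal mode :", mirrored_load(1000, 200))
print("failure mode:", mirrored_load(1000, 200, failed_disk=0))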
Abstract:
Because of their extensive applications, lifetime distributions with bathtub-shaped failure rate functions have received considerable attention in the modelling and analysis of lifetime data. The purpose of this thesis was to introduce a new class of bivariate lifetime distributions with bathtub-shaped failure rates (BTFRFs). In this research, we first reviewed univariate lifetime distributions with bathtub-shaped failure rates and several multivariate extensions of a univariate failure rate function. We then introduced a new class of bivariate distributions with bathtub-shaped failure rates (hazard gradients). Specifically, the new class of bivariate lifetime distributions was developed using Morgenstern's method of defining a bivariate class of distributions with given marginals. Computer simulations and numerical computations were used to investigate the properties of these distributions.
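For reference, Morgenstern's construction referred to above is commonly written as the Farlie-Gumbel-Morgenstern family; a standard statement of it, together with the hazard-gradient notion used for bivariate failure rates, is given below in LaTeX (the notation is the conventional one, not necessarily that of the thesis).

% Morgenstern (Farlie-Gumbel-Morgenstern) construction with given marginals
% F_X, F_Y and dependence parameter \alpha \in [-1, 1]:
\[
  F_{X,Y}(x,y) \;=\; F_X(x)\,F_Y(y)\,
  \bigl[\,1 + \alpha\bigl(1 - F_X(x)\bigr)\bigl(1 - F_Y(y)\bigr)\bigr].
\]
% The hazard gradient used for bivariate failure rates is the vector of
% negative partial derivatives of the log joint survival function:
\[
  h(x,y) \;=\; \Bigl(-\tfrac{\partial}{\partial x}\log \bar F_{X,Y}(x,y),\;
                     -\tfrac{\partial}{\partial y}\log \bar F_{X,Y}(x,y)\Bigr),
  \qquad \bar F_{X,Y}(x,y) = P(X > x,\, Y > y).
\]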