865 results for layoff hazard rates


Relevance: 80.00%

Abstract:

This paper presents a methodology and software for hazard rate analysis of induction-type watt-hour meters at Elektro Electricity and Services SA, considering the main variables related to the meters' degradation process. The hazard rate model was implemented in a user-friendly tool written in Delphi that supports not only hazard rate analysis but also classification by risk range and localization of the analyzed meters' installations. Through an expert system built on the artificial-intelligence-based risk model, the tool also supports sampling of induction-type watt-hour meters, with the main goal of monitoring and managing their degradation, maintenance, and replacement. © 2010 IEEE.

Relevance: 80.00%

Abstract:

Objective: Impaired cognition is an important dimension in psychosis and its at-risk states. Research on the value of impaired cognition for psychosis prediction in at-risk samples, however, mainly relies on study-specific sample means of neurocognitive tests, which unlike widely available general test norms are difficult to translate into clinical practice. The aim of this study was to explore the combined predictive value of at-risk criteria and neurocognitive deficits according to test norms with a risk stratification approach. Method: Potential predictors of psychosis (neurocognitive deficits and at-risk criteria) over 24 months were investigated in 97 at-risk patients. Results: The final prediction model included (1) at-risk criteria (attenuated psychotic symptoms plus subjective cognitive disturbances) and (2) a processing speed deficit (digit symbol test). The model was stratified into 4 risk classes with hazard rates between 0.0 (both predictors absent) and 1.29 (both predictors present). Conclusions: The combination of a processing speed deficit and at-risk criteria provides an optimized stratified risk assessment. Based on neurocognitive test norms, the validity of our proposed 3 risk classes could easily be examined in independent at-risk samples and, pending positive validation results, our approach could easily be applied in clinical practice in the future.
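
Read as a recipe, the stratification above is simple to sketch: cross the two binary predictors into four classes and estimate each class's hazard rate as transitions per person-year at risk. The records below are made up for illustration, not the study's data.

```python
# Illustrative risk stratification over two binary predictors; counts and
# follow-up times are placeholders, not the study's data.
from collections import defaultdict

# (at_risk_criteria, speed_deficit, transitioned_to_psychosis, years_followed)
patients = [
    (False, False, False, 2.0), (False, True, False, 2.0),
    (True,  False, True,  1.5), (True,  True,  True,  0.5),
    (True,  True,  True,  1.0), (False, False, False, 2.0),
]

events = defaultdict(float)
person_years = defaultdict(float)
for criteria, deficit, transitioned, years in patients:
    cls = (criteria, deficit)       # one of the 4 risk classes
    events[cls] += transitioned
    person_years[cls] += years

for cls in sorted(events):
    print(cls, "hazard rate ~", round(events[cls] / person_years[cls], 2))
```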

Relevance: 80.00%

Abstract:

The determination of the size as well as the power of a test is a vital part of clinical trial design. This research focuses on the simulation of clinical trial data with time-to-event as the primary outcome, investigating the impact of different recruitment patterns and time-dependent hazard structures on the size and power of the log-rank test. A non-homogeneous Poisson process is used to simulate entry times according to the different accrual patterns, and a Weibull distribution is employed to simulate survival times according to the different hazard structures. The size of the log-rank test is estimated by simulating survival times with identical hazard rates in the treatment and control arms, resulting in a hazard ratio of one. Power at specific values of the hazard ratio (≠ 1) is estimated by simulating survival times with different, but proportional, hazard rates for the two arms. Different shapes (constant, decreasing, or increasing) of the Weibull hazard function are also considered to assess the effect of hazard structure on the size and power of the log-rank test.
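
The simulation design described here is straightforward to reproduce in outline. The sketch below (Python with numpy and lifelines, not the authors' code) simulates Weibull survival times under proportional hazards, applies administrative censoring driven by a simple uniform-accrual stand-in for the non-homogeneous Poisson entry process, and estimates the size (hazard ratio = 1) and power (hazard ratio ≠ 1) of the log-rank test as rejection frequencies.

```python
# Minimal sketch of simulation-based size/power estimation for the log-rank
# test. Uniform accrual is used as a simple stand-in for the paper's
# non-homogeneous Poisson entry process.
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(42)

def rejection_rate(hazard_ratio, n_per_arm=100, shape=1.0, scale=2.0,
                   accrual=1.0, follow_up=3.0, n_sims=1000, alpha=0.05):
    """Fraction of simulated trials in which the log-rank test rejects H0.

    hazard_ratio == 1 estimates the test's size; != 1 estimates its power.
    """
    rejections = 0
    for _ in range(n_sims):
        entry = rng.uniform(0, accrual, size=2 * n_per_arm)
        # Scaling the Weibull scale by HR**(-1/shape) yields proportional
        # hazards with exactly the requested hazard ratio.
        t_ctrl = scale * rng.weibull(shape, n_per_arm)
        t_trt = scale * hazard_ratio ** (-1.0 / shape) * rng.weibull(shape, n_per_arm)
        times = np.concatenate([t_ctrl, t_trt])
        cens = accrual + follow_up - entry      # administrative censoring
        observed = times <= cens
        times = np.minimum(times, cens)
        res = logrank_test(times[:n_per_arm], times[n_per_arm:],
                           event_observed_A=observed[:n_per_arm],
                           event_observed_B=observed[n_per_arm:])
        rejections += res.p_value < alpha
    return rejections / n_sims

print("size  ~", rejection_rate(hazard_ratio=1.0))
print("power ~", rejection_rate(hazard_ratio=1.5))
```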

Relevance: 80.00%

Abstract:

This dissertation studies newly founded U.S. firms' survival using three different releases of the Kauffman Firm Survey. Each chapter studies firms' survival from a different perspective.

The first essay studies firms' survival through an analysis of their initial state at startup and their current state as they mature. The probability of survival is determined using three probit models with both firm-specific variables and an industry scale variable to control for the environment of operation. The firm-specific variables include size, experience, and leverage as a debt-to-value ratio. The results indicate that size and relevant experience are both positive predictors for the initial and current states. Debt appears to be a predictor of exit if not wisely justified by acquiring assets. As suggested previously in the literature, entering a smaller-scale industry is a positive predictor of survival from birth. Finally, a smaller-scale industry diminishes the negative effects of debt.

The second essay makes use of a hazard model to confirm that new service-providing (SP) firms are more likely to survive than new product providers (PPs). I investigate possible explanations for the higher survival rate of SPs using a Cox proportional hazards model. I examine six hypotheses (variations in capital per worker, expenses per worker, owners' experience, industry wages, assets, and size), none of which appear to explain why SPs are more likely than PPs to survive. Two other possibilities, tax evasion and human/social relations, are discussed but could not be tested due to lack of data.

The third essay investigates women-owned firms' higher failure rates using Cox proportional hazards regressions in two models. I make use of a never-before-used variable that proxies for owners' confidence, representing the owners' self-evaluated competitive advantage. The first empirical model allows me to compare women's and men's hazard rates for each variable. In the second model I successively add the variables that could potentially explain why women have a higher failure rate. Unfortunately, I am not able to fully explain the gender effect on firms' survival. Nonetheless, the second empirical approach allows me to confirm that social and psychological differences between genders are important in explaining the higher likelihood of failure in women-owned firms.
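
A hedged sketch of the kind of Cox proportional hazards fit used in the second and third essays, written with Python's lifelines library. The variable names (size, experience, debt_ratio) are illustrative stand-ins for the KFS fields, and the data are synthetic.

```python
# Sketch of a Cox proportional hazards fit on synthetic firm-survival data;
# not the dissertation's code or variables.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
size = rng.poisson(5, n).astype(float)        # e.g. employees at startup
experience = rng.integers(0, 20, n).astype(float)
debt_ratio = rng.uniform(0, 1, n)

# Synthetic exit times: size/experience lengthen survival, debt shortens it.
latent = rng.exponential(3.0, n) * np.exp(0.08 * size + 0.04 * experience
                                          - 0.9 * debt_ratio)
years = np.minimum(latent, 8.0)          # administrative censoring at 8 years
exited = (latent < 8.0).astype(int)      # 1 = firm exit observed

df = pd.DataFrame({"years": years, "exited": exited, "size": size,
                   "experience": experience, "debt_ratio": debt_ratio})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="exited")
cph.print_summary()  # hazard ratios: >1 raises exit hazard, <1 lowers it
```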

Relevance: 30.00%

Abstract:

Background: Testosterone deficiency in patients with heart failure (HF) is associated with decreased exercise capacity and mortality; however, its impact on hospital readmission rate is uncertain. Furthermore, the relationship between testosterone deficiency and sympathetic activation is unknown. Objective: We investigated the role of testosterone level in hospital readmission and mortality rates, as well as sympathetic nerve activity, in patients with HF. Methods: Total testosterone (TT) and free testosterone (FT) were measured in 110 hospitalized male patients with a left ventricular ejection fraction < 45% and New York Heart Association class IV. The patients were placed into low testosterone (LT; n = 66) and normal testosterone (NT; n = 44) groups. Hypogonadism was defined as TT < 300 ng/dL and FT < 131 pmol/L. Muscle sympathetic nerve activity (MSNA) was recorded by microneurography in a subpopulation of 27 patients. Results: Length of hospital stay was longer in the LT group than in the NT group (37 ± 4 vs. 25 ± 4 days; p = 0.008). Similarly, the cumulative hazard of readmission within 1 year was greater in the LT group than in the NT group (44% vs. 22%; p = 0.001). In the single-predictor analysis, TT (hazard ratio [HR], 2.77; 95% confidence interval [CI], 1.58–4.85; p = 0.02) predicted hospital readmission within 90 days. In addition, TT (HR, 4.65; 95% CI, 2.67–8.10; p = 0.009) and readmission within 90 days (HR, 3.27; 95% CI, 1.23–8.69; p = 0.02) predicted increased mortality. Neurohumoral activation, as estimated by MSNA, was significantly higher in the LT group than in the NT group (65 ± 3 vs. 51 ± 4 bursts/100 heartbeats; p < 0.001). Conclusion: These results support the concept that LT is an independent risk factor for hospital readmission within 90 days and increased mortality in patients with HF. Furthermore, increased MSNA was observed in patients with LT.

Relevance: 30.00%

Abstract:

BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients, or patients with treatment interruptions of ≥ 12 months, starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints, including calculations based on the Modification of Diet in Renal Disease (MDRD) formula, were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing or TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] per 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk of reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users, with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
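
For reference, the Cockcroft-Gault estimate behind the primary endpoint is a one-line formula. The sketch below is the standard published equation (age in years, weight in kg, serum creatinine in mg/dL, result in mL/min), not the cohort's analysis code.

```python
# Cockcroft-Gault creatinine clearance (standard formula, for reference only).
def cockcroft_gault(age, weight_kg, serum_creatinine_mg_dl, female=False):
    crcl = (140 - age) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction factor for women

# Illustrative: the study's endpoint is a confirmed 10 ml/min drop in cGFR.
before = cockcroft_gault(45, 70, 0.9)
after = cockcroft_gault(45, 70, 1.1)
print(f"{before:.1f} -> {after:.1f} mL/min (drop of {before - after:.1f})")
```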

Relevance: 30.00%

Abstract:

Severe rainfall events are thought to be occurring more frequently in semi-arid areas. In Morocco, flood hazard has become an important topic, notably as rapid economic development and high urbanization rates have increased the exposure of people and assets in hazard-prone areas. The Swiss Agency for Development and Cooperation (SADC) is active in natural hazard mitigation in Morocco. As hazard mapping for urban planning is considered a sound tool for vulnerability reduction, the SADC financed a project to adapt the Swiss approach to hazard assessment and mapping to a Moroccan case study (the city of Beni Mellal, Tadla-Azilal region). In a knowledge-transfer context, the Swiss method was adapted to the semi-arid environment, the specific piedmont morphology, and socio-economic constraints particular to the study site. Following the Swiss guidelines, a hydro-geomorphological map was established, containing all geomorphic elements related to known past floods. Next, rainfall/runoff modeling for reference events and hydraulic routing of the resulting hydrographs were carried out to assess the hazard quantitatively. Field-collected discharge estimations and the extents of known floods were used to verify the model results, yielding flood intensity and probability maps. Finally, an indicative danger map, as defined within the Swiss hazard assessment terminology, was derived using the Swiss hazard matrix, which convolves flood intensity with recurrence probability to assign danger degrees to the territory concerned.
Danger maps become effective risk mitigation tools when implemented in urban planning. We therefore examine how local authorities in Beni Mellal are involved in the risk management process and how knowledge about risk shapes that process. An institutional vulnerability "map" was established based on individual interviews with the main institutional actors in flood management. Results show that flood hazard management is defined by uneven actions and relationships, is based on top-down decision-making patterns, and maintains its focus on active mitigation measures rather than passive land-use planning. The institutional actors embody sectorial, often disconnected pools of risk knowledge whose relationships are dictated by the institutional hierarchy. Innovation in the risk management process emerges when actors collaborate despite the established hierarchy or open up to external knowledge pools (e.g., academia). Several methodological and institutional recommendations were addressed to risk management stakeholders in view of implementing the maps in planning. Hazard assessment and mapping are essential to an integrated risk management approach: more than mitigation tools, danger maps help communicate hazards to the public and thereby contribute to establishing a risk culture.
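
The final mapping step lends itself to a compact illustration: the Swiss matrix is essentially a lookup from (intensity class, probability class) to a danger degree. The cell assignments below are a simplified stand-in; the actual matrix must be taken from the Swiss guidelines, not from this sketch.

```python
# Illustrative danger-degree lookup in the spirit of the Swiss hazard matrix.
# Cell values are simplified placeholders, not the official assignments.
DANGER_MATRIX = {
    # (intensity, recurrence probability) -> danger degree
    ("high",   "high"):   "high",
    ("high",   "medium"): "high",
    ("high",   "low"):    "high",
    ("medium", "high"):   "high",
    ("medium", "medium"): "moderate",
    ("medium", "low"):    "moderate",
    ("low",    "high"):   "moderate",
    ("low",    "medium"): "low",
    ("low",    "low"):    "low",
}

def danger_degree(intensity: str, probability: str) -> str:
    return DANGER_MATRIX[(intensity, probability)]

# e.g. a cell flooded at medium intensity during a frequent (high-probability) event
print(danger_degree("medium", "high"))  # -> "high"
```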

Relevance: 30.00%

Abstract:

The purpose of the workshop "Do Peroxisome Proliferating Compounds Pose a Hepatocarcinogenic Hazard to Humans?" was to provide a review of the current state of the science on the relationship between peroxisome proliferation and hepatocarcinogenesis. There has been much debate regarding the mechanism by which peroxisome proliferators may induce liver tumors in rats and mice and whether these events occur in humans. A primary goal of the workshop was to determine where consensus might be reached regarding the interpretation of these data relative to the assessment of potential human risks. A core set of biochemical and cellular events has been identified in the rodent strains that are susceptible to the hepatocarcinogenic effects of peroxisome proliferators, including peroxisome proliferation, increases in fatty acyl-CoA oxidase levels, microsomal fatty acid oxidation, excess production of hydrogen peroxide, increases in rates of cell proliferation, and expression and activation of the alpha subtype of the peroxisome proliferator-activated receptor (PPAR-alpha). Such effects have not been identified clinically in liver biopsies from humans exposed to peroxisome proliferators or in in vitro studies with human hepatocytes, although PPAR-alpha is expressed at a very low level in human liver. Consensus was reached regarding the significant intermediary roles of cell proliferation and PPAR-alpha receptor expression and activation in tumor formation. Information considered necessary for characterizing a compound as a peroxisome proliferating hepatocarcinogen includes hepatomegaly, enhanced cell proliferation, and an increase in hepatic acyl-CoA oxidase and/or palmitoyl-CoA oxidation levels. Given the lack of genotoxic potential of most peroxisome proliferating agents, and since humans appear likely to be refractory or insensitive to the tumorigenic response, risk assessments based on tumor data may not be appropriate. However, nontumor data on intermediate endpoints would provide appropriate toxicological endpoints to determine a point of departure, such as the LED10 or NOAEL, which would be the basis for a margin-of-exposure (MOE) risk assessment approach. Pertinent factors to be considered in the MOE evaluation would include the slope of the dose-response curve at the point of departure, the background exposure levels, and variability in the human response.
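
The margin-of-exposure approach named in the conclusion reduces to a simple ratio of a point of departure to estimated human exposure; a minimal sketch with placeholder values:

```python
# MOE = point of departure / human exposure (same units, e.g. mg/kg-day).
# The numeric values below are made-up placeholders, not workshop data.
def margin_of_exposure(point_of_departure, human_exposure):
    return point_of_departure / human_exposure

led10 = 15.0      # hypothetical LED10, mg/kg-day
exposure = 0.003  # hypothetical background exposure, mg/kg-day
print(f"MOE = {margin_of_exposure(led10, exposure):.0f}")  # -> 5000
```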

Relevance: 30.00%

Abstract:

Purpose: The aim of this study was to evaluate the clinical fracture rate of crowns fabricated with the pressable, leucite-reinforced ceramic IPS Empress, and to relate the results to the type of tooth restored. Materials and Methods: The database SCOPUS was searched for clinical studies involving full-coverage crowns made of IPS Empress. Poisson regression analysis was used to assess the fracture rate of the crowns in relation to the type of restored tooth and study. Results: Seven clinical studies were identified, involving 1,487 adhesively luted crowns (mean observation time: 4.5 ± 1.7 years) and 81 crowns cemented with zinc-phosphate cement (mean observation time: 1.6 ± 0.8 years). Fifty-seven of the adhesively luted crowns fractured (3.8%). The majority of fractures (62%) occurred between the third and sixth year after placement. Test center had no significant influence on fracture rate, but the restored tooth type played a significant role. The hazard rate (per year) was estimated to be 5 in every 1,000 crowns for incisors, 7 in every 1,000 for premolars, 12 in every 1,000 for canines, and 16 in every 1,000 for molars. One molar crown in the zinc-phosphate group fractured after 1.2 years. Conclusion: Adhesively luted IPS Empress crowns showed a low fracture rate for incisors and premolars and a somewhat higher rate for molars and canines. The sample size of the conventionally luted crowns was too small and the observation period too short to draw meaningful conclusions. Int J Prosthodont 2010;23:129-133.
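
The per-year hazard rates quoted above are, at heart, event counts divided by crown-years of observation. The toy numbers below are chosen only to reproduce the reported rates; the paper itself estimates them with Poisson regression.

```python
# Back-of-envelope hazard rates: fractures per crown-year, scaled to 1,000.
# Counts and crown-years are illustrative, not the study's data.
fractures = {"incisor": 4, "premolar": 6, "canine": 5, "molar": 13}
crown_years = {"incisor": 800, "premolar": 850, "canine": 420, "molar": 810}

for tooth, events in fractures.items():
    rate = events / crown_years[tooth]
    print(f"{tooth}: {rate * 1000:.0f} fractures per 1,000 crown-years")
```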

Relevance: 30.00%

Abstract:

This study aimed to verify hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Points (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for export. Application of the hygienic-sanitary checklist in the industries analyzed yielded conformity rates over 96% for the aspects evaluated. The HACCP plan identified two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails, and an additional CCP, the cooking step, in the processing of whole frozen cooked lobster. Proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective way to monitor the hazards at each CCP.

Relevance: 30.00%

Abstract:

Galactic cosmic rays (GCRs) are extremely difficult to shield against and pose one of the most severe long-term hazards for human exploration of space. The recent solar minimum between solar cycles 23 and 24 shows a prolonged period of reduced solar activity and low interplanetary magnetic field strengths. As a result, the modulation of GCRs is very weak, and the fluxes of GCRs are near their highest levels in the last 25 years in the fall of 2009. Here we explore the dose rates of GCRs in the current prolonged solar minimum and make predictions for the Lunar Reconnaissance Orbiter (LRO) Cosmic Ray Telescope for the Effects of Radiation (CRaTER), which is now measuring GCRs in the lunar environment. Our results confirm the weak modulation of GCRs leading to the largest dose rates seen in the last 25 years over a prolonged period of little solar activity.

Relevance: 30.00%

Abstract:

Lightning flash rates, RL, are modulated by corotating interaction regions (CIRs) and by the polarity of the heliospheric magnetic field (HMF) in near-Earth space. As the HMF polarity reverses at the heliospheric current sheet (HCS), typically within a CIR, these phenomena are likely related. In this study, RL is found to be significantly enhanced at the HCS and at 27 days before and after it. The strength of the enhancement depends on the polarity of the HMF reversal at the HCS. Near-Earth solar and galactic energetic particle fluxes are also ordered by HMF polarity, though the variations differ qualitatively from those of RL, with the main increase occurring prior to the HCS crossing. Thus, the CIR effect on lightning is either the result of compression/amplification of the HMF (and its subsequent interaction with the terrestrial system), or energetic particle preconditioning of the Earth system prior to the HMF polarity change is central to the solar wind-lightning coupling mechanism.

Relevance: 30.00%

Abstract:

Brazil has been experiencing an increase in demand for credit cards, especially among the lower classes. However, the lower-income, less-qualified population represents a higher risk for the operation, a fact evidenced by high delinquency rates. Given this, companies use debt renegotiation strategies in an attempt to recover part of the investment made, yet few studies have examined the long-term consequences of these strategies. Using experiments run by a credit card company whose renegotiation campaigns varied from month to month, this study looked for evidence that debt renegotiation offers can affect the firm's reputation, leading clients in the same social network as the one who received the renegotiation offer to default as well. We conclude that increasing the discount offered in negotiations has a significant effect on clients' incentive to honor their obligations to the company: a 0.01 p.p. increase in the discount given to clients increases by 0.05 their probability of paying their bill late in the next period.

Relevance: 30.00%

Abstract:

This thesis integrates traditional and innovative approaches aimed at improving the identification and characterization of seismogenic normal faults, focusing mainly on slip-rate estimation as a measure of fault activity. The causative fault of the April 6, 2009 Mw 6.3 L'Aquila earthquake, namely the Paganica - San Demetrio fault system (PSDFS), was used as a test site. We developed a multidisciplinary, scale-based strategy consisting of paleoseismological investigations, detailed geomorphological and geological field studies, shallow geophysical imaging, and an innovative application of physical property measurements. We produced a detailed geomorphological and geological map of the PSDFS, defining its tectonic style, arrangement, kinematics, extent, geometry, and internal complexities. The PSDFS is a 19 km-long tectonic structure, characterized by a complex structural setting and arranged in two main sectors: the Paganica sector to the NW, characterized by a narrow deformation zone, and the San Demetrio sector to the SE, where the strain is accommodated by several tectonic structures exhuming and dissecting a wide Quaternary basin, suggesting strain migration through time. The integration of all the fault displacement data and age constraints (radiocarbon dating, optically stimulated luminescence (OSL), and tephrochronology) yielded an average Quaternary slip-rate for the PSDFS of 0.27-0.48 mm/yr. On the basis of its length (ca. 20 km) and slip per event (up to 0.8 m), we also estimated a maximum expected magnitude of 6.3-6.8 for this fault. All these topics have significant implications for surface faulting hazard in the area and may also contribute to the understanding of the PSDFS's seismic behavior and of the local seismic hazard.
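
The average slip-rate quoted above is, at its core, cumulative displacement divided by the age of the displaced marker; a trivial illustration with made-up numbers falling in the reported range:

```python
# Slip rate = cumulative fault displacement / age of the displaced marker.
# Numbers are illustrative, not measurements from the thesis.
def slip_rate_mm_per_yr(displacement_m, age_ka):
    return displacement_m * 1000.0 / (age_ka * 1000.0)  # mm over years

print(slip_rate_mm_per_yr(displacement_m=270, age_ka=750))  # -> 0.36 mm/yr
```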

Relevance: 30.00%

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast, then explain the consistency and comparison tests used in CSEP experiments to evaluate model performance. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of probabilistic seismic hazard analysis (PSHA): the declustering of seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental to transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to account for epistemic uncertainty in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree and show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
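
In its simplest form, the ensemble idea in the first chapter reduces to a weighted combination of gridded rate forecasts. A minimal sketch with made-up forecasts and weights follows; the thesis derives the weights from the models' past performance, which this example does not attempt.

```python
# Weighted ensemble of gridded earthquake-rate forecasts (illustrative only).
import numpy as np

# Three hypothetical models' forecast rates on a 4-cell grid (events/period).
forecasts = np.array([
    [0.02, 0.10, 0.05, 0.01],
    [0.04, 0.08, 0.03, 0.02],
    [0.01, 0.12, 0.06, 0.01],
])
# Placeholder performance-based weights; must sum to 1.
weights = np.array([0.5, 0.3, 0.2])

ensemble = weights @ forecasts   # combined rate forecast per grid cell
print(ensemble)
```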