985 results for epistemic closure
Crack closure and residual stress effects in fatigue of a particle-reinforced metal matrix composite
Abstract:
A study of the influence of macroscopic quenching stresses on long fatigue crack growth in an aluminium alloy-SiC composite has been made. Direct comparison between quenched plate, where high residual stresses are present, and quenched and stretched plate, where they have been eliminated, has highlighted their role in crack closure. Despite similar strength levels and identical crack growth mechanisms, the stretched composite displays faster crack growth rates over the complete range of ΔK, measured at R = 0.1, with threshold being displaced to a lower nominal ΔK value. Closure levels are dependent upon crack length, but are greater in the unstretched composite, due to the effect of surface compressive stresses acting to close the crack tip. These result in lower values of ΔKeff in the unstretched material, explaining the slower crack growth rates. Effective ΔKth values are measured at 1.7 MPa√m, confirmed by constant Kmax testing. In the absence of residual stress, closure levels of approximately 2.5 MPa√m are measured and this is attributed to a roughness mechanism.
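For context, the closure correction invoked here follows the standard Elber-type definition of the effective stress-intensity range (the abstract implies rather than states this relation), with $K_{\mathrm{cl}}$ the stress intensity at which the crack faces come into contact:

$$\Delta K_{\mathrm{eff}} = K_{\max} - \max\!\left(K_{\mathrm{cl}},\, K_{\min}\right)$$

so surface compressive residual stresses that raise $K_{\mathrm{cl}}$ reduce $\Delta K_{\mathrm{eff}}$ and hence the measured growth rate, consistent with the slower growth reported for the unstretched plate.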
Abstract:
We present a simplified model for estimating the eye-closure penalty of amplitude-noise-degraded signals. Using a typical 40-Gbit/s return-to-zero amplitude-shift-keying transmission, we demonstrate agreement between the model predictions and the results obtained from the conventional numerical estimation method over several thousand kilometers.
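For reference, the eye-closure penalty is commonly quoted in decibels as the reduction in eye opening (EO) of the received signal relative to the undistorted back-to-back signal; the abstract does not state the exact definition used in the paper, so this is only the conventional form:

$$\mathrm{ECP}\,[\mathrm{dB}] = 10\log_{10}\!\left(\frac{\mathrm{EO}_{\text{back-to-back}}}{\mathrm{EO}_{\text{received}}}\right)$$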
Abstract:
2000 Mathematics Subject Classification: 16N80, 16S70, 16D25, 13G05.
Abstract:
In this article I argue that the study of the linguistic aspects of epistemology has become unhelpfully focused on the corpus-based study of hedging and that a corpus-driven approach can help to improve upon this. Through focusing on a corpus of texts from one discourse community (that of genetics) and identifying frequent tri-lexical clusters containing highly frequent lexical items identified as keywords, I undertake an inductive analysis identifying patterns of epistemic significance. Several of these patterns are shown to be hedging devices and the whole corpus frequencies of the most salient of these, candidate and putative, are then compared to the whole corpus frequencies for comparable wordforms and clusters of epistemic significance. Finally I interviewed a ‘friendly geneticist’ in order to check my interpretation of some of the terms used and to get an expert interpretation of the overall findings. In summary I argue that the highly unexpected patterns of hedging found in genetics demonstrate the value of adopting a corpus-driven approach and constitute an advance in our current understanding of how to approach the relationship between language and epistemology.
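As a rough illustration of the corpus-driven step described above (extracting frequent tri-lexical clusters that contain keywords such as "candidate" and "putative"), here is a minimal Python sketch; the corpus file name is hypothetical and this is not the paper's actual tooling:

```python
# Minimal sketch of corpus-driven cluster extraction (hypothetical corpus file;
# not the author's actual pipeline).
from collections import Counter
import re

keywords = {"candidate", "putative"}  # example keywords mentioned in the abstract
text = open("genetics_corpus.txt", encoding="utf-8").read().lower()  # hypothetical file
tokens = re.findall(r"[a-z]+", text)

# Count every tri-lexical cluster (trigram) that contains at least one keyword.
trigrams = Counter(
    (a, b, c)
    for a, b, c in zip(tokens, tokens[1:], tokens[2:])
    if {a, b, c} & keywords
)

for cluster, freq in trigrams.most_common(20):
    print(" ".join(cluster), freq)
```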
Abstract:
In recent years there have been a number of high-profile plant closures in the UK. In several cases, the policy response has included setting up a task force to deal with the impacts of the closure. It can be hypothesised that a task force involving multi-level working across territorial boundaries and tiers of government is crucial to devising a policy response tailored to people's needs and to ensuring success in dealing with the immediate impacts of a closure. This suggests that leadership and vision, partnership working and community engagement, and delivery of high-quality services are important. This paper looks at the case of the MG Rover closure in 2005 to examine the extent to which the policy response at the national, regional and local levels dealt effectively with the immediate impacts of the closure, and the lessons that can be learned from the experience. Such lessons are of particular relevance given the closure of the LDV van plant in Birmingham in 2009 and more broadly, such as in the case of the downsizing of the Opel operation in Europe following its takeover by Magna.
Abstract:
Meier (2012) gave a "mathematical logic foundation" of the purely measurable universal type space (Heifetz and Samet, 1998). The mathematical logic foundation, however, discloses an inconsistency in the type space literature: a finitary language is used for the belief hierarchies and an infinitary language is used for the beliefs. In this paper we propose an epistemic model to fix this inconsistency. We show that in this new model the universal knowledge-belief space exists, is complete and encompasses all belief hierarchies. Moreover, by examples we demonstrate that in this model the players can agree to disagree, i.e. Aumann (1976)'s result does not hold, and that Aumann and Brandenburger (1995)'s conditions are not sufficient for Nash equilibrium. However, we show that if we substitute self-evidence (Osborne and Rubinstein, 1994) for common knowledge, then both Aumann (1976)'s and Aumann and Brandenburger (1995)'s results hold.
Abstract:
Auditor decisions regarding the causes of accounting misstatements can affect audit effectiveness and efficiency. Specifically, overconfidence in one's decision can lead to an ineffective audit, whereas underconfidence in one's decision can lead to an inefficient audit. This dissertation explored the implications of providing various types of information cues to decision-makers in an Analytical Procedure task and investigated the relationship between different types of evidence cues (confirming, disconfirming, redundant or non-redundant) and the reduction in calibration bias. Information was collected in a laboratory experiment from 45 accounting student participants. Research questions were analyzed using a 2 x 2 x 2 between-subject and within-subject analysis of covariance (ANCOVA). Results indicated that presenting subjects with information cues dissimilar to the choice they made is an effective intervention in reducing the overconfidence commonly found in decision-making. In addition, other information characteristics, specifically non-redundant information, can help reduce a decision-maker's overconfidence/calibration bias for difficult (compared to easy) decision tasks.
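A common way to quantify the over/underconfidence (calibration bias) discussed here is the gap between mean stated confidence and actual accuracy; the sketch below uses hypothetical responses and is not the dissertation's measure:

```python
# Hypothetical responses: stated confidence (0-1) and whether the judgment was correct.
confidences = [0.9, 0.8, 0.95, 0.7, 0.85]
correct     = [1,   0,   1,    0,   1]

mean_confidence = sum(confidences) / len(confidences)
accuracy = sum(correct) / len(correct)

# Positive bias = overconfidence, negative = underconfidence, zero = well calibrated.
calibration_bias = mean_confidence - accuracy
print(f"mean confidence {mean_confidence:.2f}, accuracy {accuracy:.2f}, "
      f"bias {calibration_bias:+.2f}")
```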
Abstract:
Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires exploration of a performance-based methodology applicable to Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors that include variables relating to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of the appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, the Davie Landfill was identified as the case-study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data for leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater with reference to the maximum contaminant level (MCL). In addition, the projected gas quantity was estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment. These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered and possible solutions proposed. Based on the results of PCC performance integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above the MCL and surveying of cap integrity should be continued. The parameters that cause longer monitoring periods can be eliminated for future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
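The screening step described (comparing monitored concentrations with the MCL to decide which parameters still need monitoring) can be illustrated with a small sketch; the parameter names and values below are hypothetical, not data from the Davie Landfill study:

```python
# Hypothetical monitored concentrations vs. maximum contaminant levels (mg/L).
mcl = {"arsenic": 0.010, "benzene": 0.005, "iron": 0.300}
observed = {"arsenic": 0.012, "benzene": 0.001, "iron": 0.250}

# Parameters still above their MCL keep the monitoring obligation alive;
# the rest are candidates for ending post-closure care early.
keep_monitoring = {p for p, c in observed.items() if c > mcl[p]}
print("continue monitoring:", sorted(keep_monitoring))
print("candidates to drop:", sorted(set(mcl) - keep_monitoring))
```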
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications. Examples of such applications are network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make these results non-applicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications also need to handle large volumes of unexpected events, often modified on-the-fly, containing conflicting information, and dealing with rapidly changing contexts, while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means for converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
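The abstract does not give the exact fusion rule, but a Dempster-Shafer-style combination conveys the idea of attaching a consensus belief value to a detected composite event; the mass assignments below are hypothetical:

```python
# Two network entities report belief masses over {event occurred (E), not occurred (N)},
# with some mass left on "unknown" (the whole frame, U). Values are hypothetical.
def combine(m1, m2):
    """Dempster's rule of combination for the frame {E, N} with ignorance U."""
    # Conflict: one source supports E while the other supports N.
    k = m1["E"] * m2["N"] + m1["N"] * m2["E"]
    norm = 1.0 - k
    return {
        "E": (m1["E"] * m2["E"] + m1["E"] * m2["U"] + m1["U"] * m2["E"]) / norm,
        "N": (m1["N"] * m2["N"] + m1["N"] * m2["U"] + m1["U"] * m2["N"]) / norm,
        "U": (m1["U"] * m2["U"]) / norm,
    }

sensor_a = {"E": 0.6, "N": 0.1, "U": 0.3}
sensor_b = {"E": 0.5, "N": 0.2, "U": 0.3}
print(combine(sensor_a, sensor_b))   # fused belief that the composite event occurred
```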
Abstract:
Léon Walras (1874) had already realized that his neo-classical general equilibrium model could not accommodate autonomous investment. Sen analysed the same issue in a simple, one-sector macroeconomic model of a closed economy. He showed that fixing investment in the model, built strictly on neo-classical assumptions, would make the system overdetermined, so one has to loosen some neo-classical condition of competitive equilibrium. He analysed three non-neo-classical "closure options" which could make the model well determined in the case of fixed investment. Others later extended his list, and it was shown that the closure dilemma arises in more complex computable general equilibrium (CGE) models as well, as does the choice of adjustment mechanism assumed to bring about equilibrium at the macro level. By means of numerical models it was also illustrated that the adopted closure rule can significantly affect the results of policy simulations based on a CGE model. Despite these warnings, the issue of macro closure is often neglected in policy simulations. It is, therefore, worth revisiting the issue and demonstrating its importance by further examples, as well as pointing out that the closure problem in CGE models extends well beyond the problem of how to incorporate autonomous investment into a CGE model. Several closure rules are discussed in this paper and their diverse outcomes are illustrated by numerical models calibrated on statistical data. First, the analysis is done in a one-sector model, similar to Sen's, but extended into a model of an open economy. Next, the same analyses are repeated using a fully fledged multisectoral CGE model calibrated on the same statistical data. Comparing the results obtained by the two models, it is shown that, although they generate quite similar results under the same closure option in terms of the direction and, to a somewhat lesser extent, the magnitude of change in the main macro variables, the predictions of the multi-sectoral CGE model are clearly more realistic and balanced.
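To make the closure dilemma concrete, here is a minimal numerical sketch of a one-sector model with fixed investment under three textbook closure rules; the labels follow the standard taxonomy and the parameter values are hypothetical, not taken from the paper's calibrated models:

```python
# Toy illustration of macro closure: with investment fixed, the neoclassical system
# Y = C + I, C = c*Y, Y = Y_full is overdetermined, so a closure rule must decide
# which condition is relaxed. All numbers are hypothetical.

c = 0.8          # marginal propensity to consume (assumed)
Y_full = 100.0   # full-employment output (assumed)
I_fixed = 25.0   # exogenously fixed investment (assumed)

# Closure 1 ("neoclassical"): output stays at Y_full, investment adjusts to savings.
I_neoclassical = (1 - c) * Y_full          # savings determine investment (= 20, not 25)

# Closure 2 ("Keynesian"): investment stays fixed, output adjusts via the multiplier.
Y_keynesian = I_fixed / (1 - c)            # demand-determined output (= 125)

# Closure 3 ("Kaldorian"): output fixed, the saving rate (income distribution)
# adjusts so that savings match the fixed investment.
s_required = I_fixed / Y_full              # implied aggregate saving rate (= 0.25)

print(I_neoclassical, Y_keynesian, s_required)
```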
Abstract:
Acknowledgments The research for this paper was primarily funded by an Australian Research Council (ARC) grant (DP120101092), How do we know what works? Ethics and evidence in surgical research. Katrina Hutchison’s research was also partly funded by the ARC Centre of Excellence for Electromaterials Science, where she has worked since June 2015. Discussions about the paper were facilitated by Macquarie University funding of a visit by Vikki A. Entwistle to participate in a Centre for Agency, Values and Ethics (CAVE) seminar on Capabilities Approaches to Justice. The authors would like to thank the anonymous reviewers for a number of helpful comments.
Abstract:
Funded by the Medical Research Council (ref G0701604) and administered by the NIHR-EME (ref 09-800-26).
Abstract:
BACKGROUND: The prevalence of residual shunt in patients after device closure of atrial septal defect and its impact on long-term outcome has not been previously defined. METHODS: From a prospective, single-institution registry of 408 patients, we selected individuals with agitated saline studies performed 1 year after closure. Baseline echocardiographic, invasive hemodynamic, and comorbidity data were compared to identify contributors to residual shunt. Survival was determined by review of the medical records and the Social Security Death Index. Survival analysis according to shunt included construction of Kaplan-Meier curves and Cox proportional hazards modeling. RESULTS: Among 213 analyzed patients, 27% were men and age at repair was 47 ± 17 years. Thirty patients (14%) had residual shunt at 1 year. Residual shunt was more common with Helex (22%) and CardioSEAL/STARFlex (40%) occluder devices than Amplatzer devices (9%; P = .005). Residual shunts were more common in whites (79% vs 46%, P = .004). At 7.3 ± 3.3 years of follow-up, 13 (6%) of patients had died, including 8 (5%) with Amplatzer, 5 (25%) with CardioSEAL/STARFlex, and 0 with Helex devices. Patients with residual shunting had a higher hazard of death (20% vs 4%, P = .001; hazard ratio 4.95 [1.59-14.90]). In an exploratory multivariable analysis, residual shunting, age, hypertension, coronary artery disease, and diastolic dysfunction were associated with death. CONCLUSIONS: Residual shunt after atrial septal defect device closure is common and adversely impacts long-term survival.
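A minimal sketch of the survival analysis described (Kaplan-Meier curves plus Cox proportional hazards by residual-shunt status), using the lifelines library on a hypothetical data frame; this is not the study's code or data:

```python
# Sketch of the reported analysis on hypothetical follow-up data (requires lifelines).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical data: years of follow-up, death indicator, residual-shunt flag.
df = pd.DataFrame({
    "years":          [7.1, 5.4, 9.2, 3.0, 8.5, 6.7, 4.2, 10.0],
    "died":           [0,   1,   0,   1,   0,   1,   0,   1],
    "residual_shunt": [0,   1,   0,   1,   0,   0,   1,   1],
})

# Kaplan-Meier survival curves stratified by shunt status.
kmf = KaplanMeierFitter()
for flag, group in df.groupby("residual_shunt"):
    kmf.fit(group["years"], event_observed=group["died"], label=f"shunt={flag}")
    print(f"shunt={flag}: median survival", kmf.median_survival_time_)

# Cox proportional hazards model: hazard ratio associated with residual shunt.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
print(cph.hazard_ratios_)
```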