737 results for Evidence Based Design (EBD)


Relevance:

100.00%

Publisher:

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations that respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, the solution likewise reduces manufacturing costs at the expense of increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
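To make the contrast between the three formulations concrete, the following sketch sets up a hypothetical one-variable design problem and solves it under a DDO-style safety factor, an RBDO-style failure-probability constraint, and an RO-style total expected cost objective. The cost figures, the load model and the resistance model are illustrative assumptions chosen for this example and are not taken from the paper.

```python
# Illustrative comparison of DDO, RBDO and RO on a hypothetical one-variable
# design problem. All numbers and functional forms are assumed for
# demonstration only; they are not taken from the cited paper.
from scipy.optimize import minimize_scalar
from scipy.stats import norm

C_MANUFACTURE = 1.0      # assumed cost per unit of the design variable d
C_FAILURE = 500.0        # assumed monetary consequence of failure
LOAD_MEAN, LOAD_STD = 10.0, 2.0   # assumed normally distributed load
RESISTANCE_PER_UNIT = 1.5         # assumed deterministic resistance per unit of d

def failure_probability(d):
    # P[load > resistance] under the assumed load model
    return norm.sf(RESISTANCE_PER_UNIT * d, loc=LOAD_MEAN, scale=LOAD_STD)

def ddo(safety_factor=1.5):
    # DDO: size the structure so resistance equals safety_factor * mean load
    return safety_factor * LOAD_MEAN / RESISTANCE_PER_UNIT

def rbdo(pf_target=1e-3):
    # RBDO: smallest d whose failure probability meets the target (bisection,
    # since failure_probability is monotonically decreasing in d)
    lo, hi = 0.0, 100.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if failure_probability(mid) > pf_target else (lo, mid)
    return hi

def ro():
    # RO: minimize total expected cost = manufacturing cost + Pf * failure cost
    def total_expected_cost(d):
        return C_MANUFACTURE * d + failure_probability(d) * C_FAILURE
    return minimize_scalar(total_expected_cost, bounds=(0.1, 100.0), method="bounded").x

for name, d in [("DDO", ddo()), ("RBDO", rbdo()), ("RO", ro())]:
    print(f"{name}: d = {d:.3f}, Pf = {failure_probability(d):.2e}")
```

Under these assumed numbers the three criteria select different designs, which mirrors the abstract's point that safety factors, failure-probability constraints and expected-cost minimization are not generally interchangeable.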

Relevance:

100.00%

Publisher:

Abstract:

In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is driven by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs of design, realization and maintenance. This trend towards complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as consisting of a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottled water or soda, buy boxed products such as food or cigarettes, and so on. Another indication of their complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machine. A large number of manufacturing machine companies, and notably packaging machine companies, are located in Italy; in particular, a great concentration of this industry is found in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact with one another in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to different production needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, in support of machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological/implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different and usually very “unstructured” way. No clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, many contributions have already been proposed in the scientific and technical literature to establish a suitable modelling framework for industrial automation. In recent years there has been considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety. In other words, the control system should deal not only with the nominal behaviour but also with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, along with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in mechatronic systems, complex systems such as AMS contain, alongside reliable mechanical elements, an increasing number of electronic devices, which are more vulnerable by their very nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and, when necessary, reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. Chapter 2 surveys the state of the software engineering paradigm as applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator and shows its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. Chapter 5 presents a new approach, based on Discrete Event Systems, to the problem of formal software verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
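As a rough illustration of the Discrete Event Systems viewpoint mentioned above, the sketch below models a small plant as a finite automaton with one unobservable fault event and tracks, diagnoser-style, which states and fault labels are consistent with an observed event sequence. The automaton, its events and the observation sequence are invented for this illustration and do not come from the thesis or its testbed.

```python
# Minimal discrete-event-system sketch: a plant automaton with one unobservable
# fault event, plus a diagnoser-style estimator that tracks which states and
# fault labels are consistent with a sequence of observed events.
# The automaton, its events and the observation sequence are hypothetical.

PLANT = {                 # state -> {event: next_state}
    "idle": {"start": "working"},
    "working": {"done": "idle", "fault": "degraded"},   # "fault" is unobservable
    "degraded": {"done": "idle_faulty"},
    "idle_faulty": {"start": "degraded"},
}
FAULT_EVENTS = {"fault"}          # faults are modelled as unobservable events
UNOBSERVABLE = FAULT_EVENTS

def unobservable_closure(estimates):
    """Extend (state, faulty) estimates with states reachable via unobservable events."""
    closure = set(estimates)
    frontier = list(estimates)
    while frontier:
        state, faulty = frontier.pop()
        for event, nxt in PLANT.get(state, {}).items():
            if event in UNOBSERVABLE:
                item = (nxt, faulty or event in FAULT_EVENTS)
                if item not in closure:
                    closure.add(item)
                    frontier.append(item)
    return closure

def diagnose(observations, initial_state="idle"):
    """Return the set of (state, faulty) pairs consistent with the observations."""
    estimates = unobservable_closure({(initial_state, False)})
    for obs in observations:
        stepped = {(PLANT[s][obs], f) for s, f in estimates if obs in PLANT[s]}
        estimates = unobservable_closure(stepped)
    return estimates

# After observing start, done, start the estimate still contains both a nominal
# and a faulty hypothesis: the fault is suspected but not yet certain.
print(diagnose(["start", "done", "start"]))
```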

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Neovascular age-related macular degeneration (AMD) has a poor prognosis if left untreated, frequently resulting in legal blindness. Ranibizumab is approved for treating neovascular AMD. However, further guidance is needed to assist ophthalmologists in clinical practice to optimise treatment outcomes. METHODS: An international retina expert panel assessed evidence available from prospective, multicentre studies evaluating different ranibizumab treatment schedules (ANCHOR, MARINA, PIER, SAILOR, SUSTAIN and EXCITE) and a literature search to generate evidence-based and consensus recommendations for treatment indication and assessment, retreatment and monitoring. RESULTS: Ranibizumab is indicated for choroidal neovascular lesions with active disease, the clinical parameters of which are outlined. Treatment initiation with three consecutive monthly injections, followed by continued monthly injections, has provided the best visual-acuity outcomes in pivotal clinical trials. If continued monthly injections are not feasible after initiation, a flexible strategy appears viable, with monthly monitoring of lesion activity recommended. Initiation regimens of fewer than three injections have not been assessed. Continuous careful monitoring with flexible retreatment may help avoid vision loss recurring. Standardised biomarkers need to be determined. CONCLUSION: Evidence-based guidelines will help to optimise treatment outcomes with ranibizumab in neovascular AMD.

Relevance:

100.00%

Publisher:

Abstract:

Screening people without symptoms of disease is an attractive idea. Screening allows early detection of disease or elevated risk of disease, and has the potential for improved treatment and reduction of mortality. The list of future screening opportunities is set to grow because of the refinement of screening techniques, the increasing frequency of degenerative and chronic diseases, and the steadily growing body of evidence on genetic predispositions for various diseases. But how should we decide on the diseases for which screening should be done and on recommendations for how it should be implemented? We use the examples of prostate cancer and genetic screening to show the importance of considering screening as an ongoing population-based intervention with beneficial and harmful effects, and not simply the use of a test. Assessing whether screening should be recommended and implemented for any named disease is therefore a multi-dimensional task in health technology assessment. There are several countries that already use established processes and criteria to assess the appropriateness of screening. We argue that the Swiss healthcare system needs a nationwide screening commission mandated to conduct appropriate evidence-based evaluation of the impact of proposed screening interventions, to issue evidence-based recommendations, and to monitor the performance of screening programmes introduced. Without explicit processes there is a danger that beneficial screening programmes could be neglected and that ineffective, and potentially harmful, screening procedures could be introduced.

Relevance:

100.00%

Publisher:

Abstract:

Randomised controlled trial.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this article was to record reporting characteristics related to study quality in research published in the major specialty dental journals with the highest impact factors (Journal of Endodontics, Journal of Oral and Maxillofacial Surgery, American Journal of Orthodontics and Dentofacial Orthopedics, Pediatric Dentistry, Journal of Clinical Periodontology, and International Journal of Prosthetic Dentistry). The included articles were classified into the following 3 broad subject categories: (1) cross-sectional (snapshot), (2) observational, and (3) interventional. Multinomial logistic regression was conducted for effect estimation, using the journal as the response and randomization, sample size calculation, discussion of confounding, multivariate analysis, effect measurement, and confidence intervals as the explanatory variables. The results showed that cross-sectional studies were the dominant design (55%), whereas observational investigations accounted for 13% and interventions/clinical trials for 32%. Reporting of quality characteristics was low for all variables: random allocation (15%), sample size calculation (7%), confounding issues/possible confounders (38%), effect measurements (16%), and multivariate analysis (21%). Eighty-four percent of the published articles reported a statistically significant main finding, and only 13% presented confidence intervals. The Journal of Clinical Periodontology showed the highest probability of including quality characteristics in reported results among all the dental journals.
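For readers unfamiliar with the analysis named above, the following sketch shows one way a multinomial logistic regression with the journal as the categorical response and binary reporting-quality indicators as explanatory variables could be set up; the data frame is randomly generated and purely hypothetical, so the fitted coefficients mean nothing beyond demonstrating the mechanics.

```python
# Illustrative sketch of a multinomial logistic regression with the journal as
# the categorical response and binary reporting-quality indicators as the
# explanatory variables. The data are randomly generated and purely
# hypothetical; they do not reproduce the study's data set.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
journals = ["J Endod", "J Oral Maxillofac Surg", "Am J Orthod Dentofacial Orthop",
            "Pediatr Dent", "J Clin Periodontol", "Int J Prosthodont"]
n = 600
df = pd.DataFrame({
    "journal": rng.choice(journals, size=n),
    "randomization": rng.integers(0, 2, size=n),
    "sample_size_calculation": rng.integers(0, 2, size=n),
    "confounding_discussed": rng.integers(0, 2, size=n),
    "multivariate_analysis": rng.integers(0, 2, size=n),
    "effect_measurement": rng.integers(0, 2, size=n),
    "confidence_intervals": rng.integers(0, 2, size=n),
})

X = df.drop(columns="journal")
y = df["journal"]

# With the default lbfgs solver and more than two classes, scikit-learn fits a
# multinomial (softmax) model; coef_ holds one row of log-odds per journal.
model = LogisticRegression(max_iter=1000).fit(X, y)
coefficients = pd.DataFrame(model.coef_, index=model.classes_, columns=X.columns)
print(coefficients.round(2))
```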

Relevance:

100.00%

Publisher:

Abstract:

To identify reasons for ordering computed tomography pulmonary angiography (CTPA), to identify the frequency of reasons for CTPA reflecting defensive behavior and evidence-based behavior, and to identify the impact of defensive medicine and of training about diagnosing pulmonary embolism (PE) on positive results of CTPA.

Relevance:

100.00%

Publisher:

Abstract:

In recent decades there has been a marked decline in most ortolan bunting Emberiza hortulana populations in temperate Europe, with many regional populations now extinct or on the brink of extinction. In contrast, Mediterranean and, as far as we know, eastern European populations seem to have remained relatively stable. The causes of decline remain unclear but include: habitat loss and degradation, and related reduction in prey availability; climate change on the breeding grounds; altered population dynamics; illegal captures during migration; and environmental change in wintering areas. We review the current knowledge of the biology of the ortolan bunting and discuss the proposed causes of decline in relation to the different population trends in temperate and Mediterranean Europe. We suggest new avenues of research to identify the factors limiting ortolan bunting populations. The main evidence-based conservation measure that is likely to enhance habitat quality is the creation of patches of bare ground to produce sparsely vegetated foraging grounds in invertebrate-rich grassy habitats close to breeding areas.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To simultaneously determine perceived vs. practiced adherence to recommended interventions for the treatment of severe sepsis or septic shock. DESIGN: One-day cross-sectional survey. SETTING: Representative sample of German intensive care units stratified by hospital size. PATIENTS: Adult patients with severe sepsis or septic shock. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Practice recommendations were selected by German Sepsis Competence Network (SepNet) investigators. External intensivists visited randomly chosen intensive care units and asked the responsible intensive care unit director how often these recommendations were used. Responses of "always" and "frequently" were combined to depict perceived adherence. Thereafter, patient files were audited. Three hundred sixty-six patients in 214 intensive care units fulfilled the criteria and received full support. One hundred fifty-two patients had acute lung injury or acute respiratory distress syndrome. Low tidal volume ventilation (≤6 mL/kg predicted body weight) was documented in 2.6% of these patients. A total of 17.1% of patients had a tidal volume between 6 and 8 mL/kg predicted body weight, and 80.3% had >8 mL/kg predicted body weight. Mean tidal volume was 10.0 ± 2.4 mL/kg predicted body weight. Perceived adherence to low tidal volume ventilation was 79.9%. Euglycemia (4.4-6.1 mmol/L) was documented in 6.2% of 355 patients. A total of 33.8% of patients had blood glucose levels ≤8.3 mmol/L and 66.2% were hyperglycemic (blood glucose >8.3 mmol/L). Among 207 patients receiving insulin therapy, 1.9% were euglycemic, 20.8% had blood glucose levels ≤8.3 mmol/L, and 1.0% were hypoglycemic. Overall, the mean maximal glucose level was 10.0 ± 3.6 mmol/L. Perceived adherence to strict glycemic control was 65.9%. Although perceived adherence to recommendations was higher in academic and larger hospitals, actual practice was not significantly influenced by hospital size or university affiliation. CONCLUSIONS: This representative survey shows that current therapy of severe sepsis in German intensive care units complies poorly with practice recommendations. Intensive care unit directors perceive adherence to be higher than it actually is. Implementation strategies involving all intensive care unit staff are needed to overcome this gap between current evidence-based knowledge, practice, and perception.

Relevance:

100.00%

Publisher:

Abstract:

This project consists of a proposed curriculum for a semester-long, community-based workshop for LGBTQIA+ (lesbian, gay, bisexual, trans*, queer or questioning, intersex, asexual or ally, "+" indicating other identifications that deviate from heterosexual) youth ages 16-18. The workshop focuses on an exploration of LGBTQIA+ identity and community through discussion and collaborative rhetorical analysis of visual and social media. Informed by queer theory and history, studies on youth work, and visual media studies and incorporating rhetorical criticism as well as liberatory pedagogy and community literacy practices, the participation-based design of the workshop seeks to involve participants in selection of media texts, active analytical viewership, and multimodal response. The workshop is designed to engage participants in reflection on questions of individual and collective responsibility and agency as members and allies of various communities. The goal of the workshop is to strengthen participants' abilities to analyze the complex ways in which television, film, and social media influence their own and others’ perceptions of issues surrounding queer identities. As part of the reflective process, participants are challenged to consider how they can in turn actively and collaboratively respond to and potentially help to shape these perceptions. My project report details the theoretical framework, pedagogical rationale, methods of text selection and critical analysis, and guidelines for conduct that inform and structure the workshop.

Relevance:

100.00%

Publisher:

Abstract:

On the one hand, prosthodontic reconstructions compensate for the sequelae of negative changes in the oral cavity; on the other hand, they often enhance or accelerate them. As a consequence of negative changes in the oral cavity over time, treatment planning for RPDs becomes highly complex. A set of reliable criteria is necessary for decision-making and problem management. It appears that the majority of published data on RPDs do not depict high effectiveness of this treatment modality. From a strict evidence-based dentistry point of view, the level of evidence for RPDs is low, if not missing. Randomized controlled trials on RPDs are difficult to design, are not feasible for some questions due to the complexity of the material, or may remain without clinical relevance. The literature rarely gives information on denture design, tooth selection, and management of the compromised structural integrity of teeth. So far, treatment outcomes with RPDs must be interpreted in light of the bias in indication and patient selection for RPDs. Better clinical models with more stringent concepts for providing RPDs should be elaborated. This encompasses risk analysis and patient assessment, proper indications for maintenance or extraction of teeth, strategic placement of implants, biomechanical aspects, materials, and technology. Although there is a tendency to offer fixed prostheses to our patients, this might change again with demographic shifts: an increasing ageing population, an increase in reduced dentitions, and low socioeconomic wealth in large parts of the world.

Relevance:

100.00%

Publisher:

Abstract:

The rise of evidence-based medicine as well as important progress in statistical methods and computational power have led to a second birth of the >200-year-old Bayesian framework. The use of Bayesian techniques, in particular in the design and interpretation of clinical trials, offers several substantial advantages over the classical statistical approach. First, in contrast to classical statistics, Bayesian analysis allows a direct statement regarding the probability that a treatment was beneficial. Second, Bayesian statistics allow the researcher to incorporate any prior information in the analysis of the experimental results. Third, Bayesian methods can efficiently handle complex statistical models, which are suited for advanced clinical trial designs. Finally, Bayesian statistics encourage a thorough consideration and presentation of the assumptions underlying an analysis, which enables the reader to fully appraise the authors' conclusions. Both Bayesian and classical statistics have their respective strengths and limitations and should be viewed as being complementary to each other; we do not attempt to make a head-to-head comparison, as this is beyond the scope of the present review. Rather, the objective of the present article is to provide a nonmathematical, reader-friendly overview of the current practice of Bayesian statistics coupled with numerous intuitive examples from the field of oncology. It is hoped that this educational review will be a useful resource to the oncologist and result in a better understanding of the scope, strengths, and limitations of the Bayesian approach.
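As a toy illustration of the first two advantages listed above (direct probability statements and the incorporation of prior information), the sketch below applies a conjugate Beta-Binomial model to hypothetical two-arm trial counts and reports the posterior probability that the treatment's response rate exceeds the control's; the counts and the prior are invented for demonstration.

```python
# Toy Beta-Binomial illustration of a Bayesian two-arm trial analysis.
# Counts and priors are hypothetical, chosen only to demonstrate the mechanics.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(42)

# Hypothetical trial results: responders / patients per arm
resp_treat, n_treat = 30, 50
resp_ctrl, n_ctrl = 20, 50

# Prior information encoded as Beta(a, b): Beta(1, 1) is flat, while e.g.
# Beta(4, 6) would encode a mild prior belief in a response rate near 40%.
a_prior, b_prior = 1, 1

# Conjugate update: posterior is Beta(a + responders, b + non-responders)
post_treat = beta(a_prior + resp_treat, b_prior + n_treat - resp_treat)
post_ctrl = beta(a_prior + resp_ctrl, b_prior + n_ctrl - resp_ctrl)

# Direct probability statement via Monte Carlo sampling from the two posteriors
samples_treat = post_treat.rvs(100_000, random_state=rng)
samples_ctrl = post_ctrl.rvs(100_000, random_state=rng)
p_benefit = np.mean(samples_treat > samples_ctrl)

print(f"Posterior mean response rate, treatment: {post_treat.mean():.3f}")
print(f"Posterior mean response rate, control:   {post_ctrl.mean():.3f}")
print(f"P(treatment rate > control rate):        {p_benefit:.3f}")
```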

Relevance:

100.00%

Publisher:

Abstract:

As contentions continue to engulf the evidence-based practice (EBP) debate within social care, consensus seems to be gravitating towards the softer term of ‘evidence-aware’ practice, although there is as yet no definitive concept on the horizon. Set against a backdrop of competing ideologies, heavily influenced by the natural sciences, what is at stake is the essence of social work: there is a real danger that social work practice will be reduced to mere base elements, which seek to eliminate all notions of uncertainty, so essential and endemic to our social world.

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, desertification research has focused on degradation assessments, whereas prevention and mitigation strategies have not been sufficiently emphasised, although the concept of sustainable land management (SLM) is increasingly being acknowledged. SLM strategies are interventions at the local to regional scale that aim to increase productivity, protect the natural resource base, and improve livelihoods. The global WOCAT initiative and its partners have developed harmonised frameworks to compile, evaluate and analyse the impact of SLM practices around the globe. Recent studies within the EU research project DESIRE developed a methodological framework that combines a collective learning and decision-making approach with the use of best practices from the WOCAT database. In-depth assessment of 30 technologies and 8 approaches from 17 desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats such as water scarcity, soil and vegetation degradation, low production, climate change, resource use conflicts and migration. Among the impacts attributed to the documented technologies, those mentioned most were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improving soil moisture, or reducing runoff. Water harvesting offers under-exploited opportunities for the drylands and the predominantly rainfed farming systems of the developing world. Recently compiled guidelines introduce the concepts behind water harvesting and propose a harmonised classification system, followed by an assessment of the suitability, adoption and up-scaling of practices. Case studies range from large-scale floodwater spreading that makes alluvial plains cultivable, to systems that boost cereal production on small farms, to practices that collect and store water from household compounds. Once contextualised and set in appropriate institutional frameworks, they can form part of an overall adaptation strategy for land users. More field research is needed to reinforce expert assessments of SLM impacts and provide the necessary evidence-based rationale for investing in SLM. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as currently aimed at within the new EU CASCADE project.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To determine whether algorithms developed for the World Wide Web can be applied to the biomedical literature in order to identify articles that are important as well as relevant. DESIGN AND MEASUREMENTS: A direct comparison of eight algorithms: simple PubMed queries, clinical queries (sensitive and specific versions), vector cosine comparison, citation count, journal impact factor, PageRank, and machine learning based on polynomial support vector machines. The objective was to prioritize important articles, defined as being included in a pre-existing bibliography of important literature in surgical oncology. RESULTS: Citation-based algorithms were more effective than noncitation-based algorithms at identifying important articles. The most effective strategies were simple citation count and PageRank, which on average identified over six important articles in the first 100 results compared to 0.85 for the best noncitation-based algorithm (p < 0.001). The authors saw similar differences between citation-based and noncitation-based algorithms at 10, 20, 50, 200, 500, and 1,000 results (p < 0.001). Citation lag affects performance of PageRank more than simple citation count. However, in spite of citation lag, citation-based algorithms remain more effective than noncitation-based algorithms. CONCLUSION: Algorithms that have proved successful on the World Wide Web can be applied to biomedical information retrieval. Citation-based algorithms can help identify important articles within large sets of relevant results. Further studies are needed to determine whether citation-based algorithms can effectively meet actual user information needs.
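As a concrete illustration of the two strategies the study found most effective, the sketch below ranks a tiny hypothetical citation graph both by simple citation count and by PageRank computed with power iteration; the articles and citation links are invented for the example.

```python
# Ranking a tiny hypothetical citation graph by (a) simple citation count and
# (b) PageRank computed by power iteration. Articles and links are invented.
import numpy as np

# cites[a] = articles that article a cites (outgoing links)
cites = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": [],
    "D": ["B", "C"],
    "E": ["D"],
}
articles = sorted(cites)
idx = {a: i for i, a in enumerate(articles)}
n = len(articles)

# (a) citation count: number of incoming citations per article
citation_count = {a: 0 for a in articles}
for refs in cites.values():
    for r in refs:
        citation_count[r] += 1

# (b) PageRank with damping factor 0.85; an article with no outgoing
# citations (a dangling node) spreads its rank uniformly over all articles
d = 0.85
rank = np.full(n, 1.0 / n)
for _ in range(100):
    new_rank = np.full(n, (1.0 - d) / n)
    for a, refs in cites.items():
        share = rank[idx[a]] / (len(refs) if refs else n)
        for r in (refs if refs else articles):
            new_rank[idx[r]] += d * share
    if np.abs(new_rank - rank).sum() < 1e-10:
        rank = new_rank
        break
    rank = new_rank

for a in sorted(articles, key=lambda a: -rank[idx[a]]):
    print(f"{a}: citations = {citation_count[a]}, PageRank = {rank[idx[a]]:.3f}")
```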