899 results for Design methods


Relevance: 40.00%

Abstract:

Sweden's recent report on Urban Sustainable Development calls out a missing link between the urban design process and citizens. This paper investigates whether engaging citizens as design agents, by providing a platform for alternate participation, can bridge this gap through the transfer of spatial agency and new modes of critical cartography. To assess whether this is the case, the approaches were applied to Stockholm's urban agriculture movement in a staged intervention. The aim of the intervention was to engage citizens in locating existing and potential places for growing food and in gathering information from these sites to inform design in urban agriculture. The design-based methodologies incorporated digital and bodily interfaces for this cartography to take place. The Urban CoMapper, a smartphone app, captured real-time perspectives through crowd-sourced mapping. In the bodily cartography, participants used their bodies to trace the site and reveal their sensorial perceptions. The data gathered from these approaches gave way to a mode of artistic research for exploring urban agriculture, along with inviting artists to engage in the dialogues. In sum, the results showed that a combination of digital and bodily approaches is necessary for a critical cartography if we want to engage citizens holistically in the urban design process as spatial agents informing urban policy. Such methodologies formed a reflective interrogation and encouraged a new intimacy with nature, in this instance one that can transform our urban conduct by questioning our eating habits: where we get our food from and how we eat it seasonally.

Relevance: 40.00%

Abstract:

Our new molecular understanding of immune priming states that dendritic cell (DC) activation is pivotal for the expansion and differentiation of naïve T lymphocytes; it follows that understanding DC activation is essential for understanding and designing vaccine adjuvants. This chapter describes how dendritic cells can be used as a core tool to provide detailed quantitative and predictive immunomics information about how adjuvants function. The role of distinct antigen, costimulation, and differentiation signals from activated DCs in priming is explained. Four categories of input signals that control DC activation (direct pathogen detection, sensing of injury or cell death, indirect activation via endogenous proinflammatory mediators, and feedback from activated T cells) are compared and contrasted. Practical methods for studying adjuvants using DCs are summarised, and the importance of DC subset choice, simulating T cell feedback, and the use of knockout cells is highlighted. Finally, five case studies are examined that illustrate the benefit of DC activation analysis for understanding vaccine adjuvant function.

Relevance: 40.00%

Abstract:

Background: Although the release of cardiac biomarkers after percutaneous (PCI) or surgical (CABG) revascularization is common, its prognostic significance is not known. Questions remain about the mechanisms and degree of correlation between the release, the volume of myocardial tissue loss, and the long-term significance. Delayed-enhancement cardiac magnetic resonance (CMR) consistently quantifies areas of irreversible myocardial injury. To investigate the quantitative relationship between irreversible injury and cardiac biomarkers, we will evaluate the extent of irreversible injury in patients undergoing PCI and CABG and relate it to postprocedural changes in cardiac biomarkers and to long-term prognosis. Methods/Design: The study will include 150 patients with multivessel coronary artery disease (CAD), preserved left ventricular ejection fraction (LVEF), and a formal indication for CABG: 50 patients will undergo CABG with cardiopulmonary bypass (CPB); 50 patients with the same arterial and ventricular condition indicated for myocardial revascularization will undergo CABG without CPB; and another 50 patients with CAD and preserved ventricular function will undergo PCI using stents. All patients will undergo CMR before and after surgery or PCI. We will also evaluate the release of cardiac markers of necrosis immediately before and after each procedure. The primary outcome is overall death at 5-year follow-up. Secondary outcomes are levels of the CK-MB isoenzyme and troponin I in association with the presence of myocardial fibrosis and systolic left ventricular dysfunction assessed by CMR. Discussion: The MASS-V Trial aims to establish reliable values for enzyme markers of myocardial necrosis in the absence of manifest myocardial infarction after mechanical interventions. Establishing these indices has diagnostic and prognostic value and can therefore guide relevant, and differing, therapeutic measures. In daily practice, inappropriate use of these necrosis markers has led to misdiagnosis and hence to wrong treatment. A more sensitive tool such as CMR provides unprecedented diagnostic accuracy for myocardial damage when correlated with necrosis enzyme markers. We aim to correlate laboratory data with imaging, thereby establishing more refined data on the presence or absence of irreversible myocardial injury after the procedure, whether percutaneous or surgical, and with or without the use of cardiopulmonary bypass.

Relevance: 40.00%

Abstract:

Blank pages have been removed.

Relevance: 40.00%

Abstract:

The purpose of this work is to define and calculate a factor of collapse for the traditional method of designing sheet pile walls, and to find the parameters that most influence a finite element model representative of this problem. The text is structured as follows: chapters 1 to 5 analyze a series of arguments useful for understanding the problem, while the considerations most directly related to the purpose of the work are reported in chapters 6 to 10. The first part of the document covers: what a sheet pile wall is, which codes must be followed in the design of these structures and what they prescribe, how a mathematical model of the soil can be formulated, some fundamentals of finite element analysis, and finally, the traditional methods that support the design of sheet pile walls. In chapter 6 we performed a parametric analysis, answering the second part of the purpose of the work. Comparing the results of a laboratory test on a cantilever sheet pile wall in sandy soil with those provided by a finite element model of the same problem, we concluded that: in modelling a sandy soil, attention must be paid to the value of cohesion inserted in the model (some programs, such as Abaqus, do not accept a null value for this parameter); the friction angle and the elastic modulus of the soil significantly influence the behavior of the structure-soil system; and other parameters, such as the dilatancy angle or Poisson's ratio, do not seem to influence it. The logical path followed in the second part of the text is as follows. We analyzed two different structures: the first supports an excavation of 4 m, the second an excavation of 7 m. Both structures are first designed using the traditional method; then they are implemented in a finite element program (Abaqus) and pushed to collapse by decreasing the friction angle of the soil. The factor of collapse is the ratio between the tangent of the initial friction angle and the tangent of the friction angle at collapse. Finally, we performed a more detailed analysis of the first structure, observing that the value of the factor of collapse is influenced by a wide range of parameters, including the values of the coefficients assumed in the traditional method and the relative stiffness of the structure-soil system. In the majority of cases, we found that the value of the factor of collapse lies between 1.25 and 2. With some considerations, reported in the text, these values can be compared with the safety factor proposed by the code (linked to the friction angle of the soil).
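
As a quick illustration of the factor-of-collapse definition given above, here is a minimal Python sketch; the angle values in the example are hypothetical and not taken from the study.

```python
import math

def factor_of_collapse(phi_initial_deg: float, phi_collapse_deg: float) -> float:
    """Ratio of the tangent of the initial friction angle to the tangent
    of the friction angle at which the wall is brought to collapse."""
    return math.tan(math.radians(phi_initial_deg)) / math.tan(math.radians(phi_collapse_deg))

# Hypothetical example: initial friction angle 35 degrees, collapse at 22 degrees.
print(round(factor_of_collapse(35.0, 22.0), 2))  # 1.73, within the 1.25-2 range reported above
```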

Relevance: 40.00%

Abstract:

Decomposition-based approaches are recalled from the primal and dual points of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated-versus-disaggregated formulations to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum-cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required to solve it. This trade-off is explored on several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is proposed, based on the well-known Frank-Wolfe algorithm. To provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions to this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is then addressed: an undirected graph with edge costs is given together with a discrete set of balance matrices representing different supply/demand scenarios, and the goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible in every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented. The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
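
For context on the route assignment step, a minimal generic sketch of the Frank-Wolfe iteration follows; the toy objective, oracle, and fixed step rule are standard textbook choices standing in for the thesis implementation, where the direction-finding subproblem would be a shortest-path (min-cost flow) assignment.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iterations=200):
    """Generic Frank-Wolfe loop: at each step, minimize the linearized
    objective over the feasible set (the 'lmo'), then step toward that
    extreme point with the classic diminishing step size 2 / (k + 2)."""
    x = x0.astype(float)
    for k in range(iterations):
        s = lmo(grad(x))         # direction-finding subproblem; in route
                                 # assignment, an all-or-nothing shortest-path
                                 # (min-cost flow) assignment
        gamma = 2.0 / (k + 2.0)  # fixed step rule; a line search also works
        x = x + gamma * (s - x)
    return x

# Toy usage: minimize ||x - c||^2 over the probability simplex, whose
# linear minimization oracle simply returns the best unit coordinate vector.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2.0 * (x - c)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
print(frank_wolfe(grad, lmo, np.array([1.0, 0.0, 0.0])))  # approaches c
```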

Relevance: 40.00%

Abstract:

Background: Men who have sex with men (MSM) remain the group most at risk of acquiring HIV infection in Britain. HIV prevalence appears to vary widely between MSM from different ethnic minority groups in this country, for reasons that are not fully understood. The aim of the MESH project was to examine in detail the sexual health of ethnic minority MSM living in Britain. Methods/Design: The main objectives of the MESH project were to explore, among ethnic minority MSM living in Britain: (i) sexual risk behaviour and HIV prevalence; (ii) their experience of stigma and discrimination; (iii) disclosure of sexuality; (iv) use of, and satisfaction with, sexual health services; and (v) the extent to which sexual health services (for treatment and prevention) are aware of the needs of ethnic minority MSM. The research was conducted between 2006 and 2008 in four national samples: (i) ethnic minority MSM living in Britain; (ii) a comparison group of white British MSM living in Britain; (iii) NHS sexual health clinic staff in 15 British towns and cities with significant ethnic minority communities; and (iv) sexual health promotion/HIV prevention service providers. We also recruited men from two "key migrant" groups living in Britain: MSM born in Central or Eastern Europe and MSM born in Central or South America. Internet-based quantitative and qualitative research methods were used. Ethnic minority MSM were recruited through advertisements on websites, in community venues, via informal networks, and in sexual health clinics. White and "key migrant" MSM were recruited mostly through Gaydar, one of the most popular dating sites used by gay men in Britain. MSM who agreed to take part completed a questionnaire online. Ethnic minority MSM who completed the online questionnaire were asked whether they would be willing to take part in an online qualitative interview conducted by email. Service providers were identified through the British Association for Sexual Health and HIV (BASHH) and the Terrence Higgins Trust (THT) CHAPS partnerships. Staff who agreed to take part were asked to complete a questionnaire online. The online survey was completed by 1,241 ethnic minority MSM; 416 men born in South and Central America or Central and Eastern Europe; and 13,717 white British MSM. In addition, 67 ethnic minority MSM took part in the online qualitative interview, and 364 people working in sexual health clinics and 124 health promotion workers from around Britain completed an online questionnaire. Discussion: The findings from this study will improve our understanding of the sexual health and needs of ethnic minority MSM in Britain.

Relevance: 40.00%

Abstract:

Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
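
As an illustration of the restricted randomization mentioned above, here is a minimal sketch of permuted-block list generation; the block composition, arm labels, and seed are arbitrary choices for the example.

```python
import random

def permuted_block_list(n_blocks, block=("A", "A", "B", "B"), seed=2024):
    """Permuted-block (restricted) randomization: every block holds a fixed,
    balanced mix of arms, and only the order within each block is random,
    so group sizes stay balanced throughout recruitment."""
    rng = random.Random(seed)  # fixed seed here only for reproducibility
    allocations = []
    for _ in range(n_blocks):
        b = list(block)
        rng.shuffle(b)
        allocations.extend(b)
    return allocations

print(permuted_block_list(3))  # e.g. ['B', 'A', 'B', 'A', 'A', 'B', ...]
```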

Relevance: 40.00%

Abstract:

ONTOLOGIES AND METHODS FOR INTEROPERABILITY OF ENGINEERING ANALYSIS MODELS (EAMS) IN AN E-DESIGN ENVIRONMENT. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.

Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-base tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, these methods enable recipient-specific, fine-grained controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD, and they play an efficient role in reducing the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
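
A minimal sketch of the kind of rule-based model switching described above; the condition names and selection rule are hypothetical illustrations, not the actual knowledge-base logic implemented in FiPER.

```python
def select_analysis_model(near_optimum: bool, high_stress_gradient: bool) -> str:
    """Use the cheap beam-element model while exploring the design space and
    switch to the high-fidelity shell-element model when accuracy matters."""
    if near_optimum or high_stress_gradient:
        return "shell_element_model"  # high fidelity, high computational cost
    return "beam_element_model"       # lower accuracy, much cheaper

# Early exploration favours the cheap model; refinement near a candidate
# optimum triggers the expensive one.
print(select_analysis_model(near_optimum=False, high_stress_gradient=False))  # beam_element_model
print(select_analysis_model(near_optimum=True, high_stress_gradient=False))   # shell_element_model
```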

Relevance: 40.00%

Abstract:

Continuous advancements in technology have led to increasingly comprehensive and distributed product development processes in pursuit of improved products at reduced cost. The information associated with these products is ever-changing, and structured frameworks have become integral to managing such fluid information. Ontologies and the Semantic Web have emerged as key alternatives for capturing product knowledge in both a human-readable and a computable manner. The primary focus of this research is to characterize the relationships formed within methodically developed distributed design knowledge frameworks, ultimately to provide pervasive real-time awareness in distributed design processes. Utilizing formal logics in the form of the Semantic Web's OWL and SWRL, causal relationships are expressed to guide and facilitate knowledge acquisition as well as to identify contradictions within a knowledge base. To improve efficiency during both the development and operational phases of these intelligent frameworks, a semantic relatedness algorithm is designed specifically to identify and rank underlying relationships within product development processes. After reviewing several semantic relatedness measures, three techniques, including a novel meronomic technique, are combined to create AIERO, the Algorithm for Identifying Engineering Relationships in Ontologies. To determine its applicability and accuracy, AIERO was applied to three separate, independently developed ontologies. The results indicate that AIERO is capable of consistently returning relatedness values one would intuitively expect. To assess the effectiveness of AIERO in exposing underlying causal relationships across product development platforms, a case study involving the development of an industry-inspired printed circuit board (PCB) is presented. After instantiating the PCB knowledge base and developing an initial set of rules, FIDOE, the Framework for Intelligent Distributed Ontologies in Engineering, was employed to identify additional causal relationships through extensional relatedness measurements. In a concluding PCB redesign, the resulting intelligent framework demonstrated its ability to pass values between instances, identify inconsistencies among instantiated knowledge, and identify conflicting values within product development frameworks. The results highlight how the introduced semantic methods can enhance the knowledge acquisition, knowledge management, and knowledge validation capabilities of traditional knowledge bases.
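
To make the idea of combining relatedness measures concrete, here is a hedged sketch of a weighted blend of three pairwise measures, in the spirit of AIERO; the component measures, weights, and function signature are invented for illustration and are not the published algorithm.

```python
def combined_relatedness(path_score: float, meronomic_score: float,
                         attribute_score: float,
                         weights=(0.4, 0.3, 0.3)) -> float:
    """Blend three pairwise relatedness measures (each normalized to [0, 1])
    into a single score so that concept pairs can be ranked consistently."""
    scores = (path_score, meronomic_score, attribute_score)
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical concept pair: strongly related through taxonomy and part-whole
# (meronomic) structure, more weakly through shared attributes.
print(round(combined_relatedness(0.9, 0.8, 0.5), 2))  # 0.75
```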

Relevance: 40.00%

Abstract:

Through the correct implementation of lean manufacturing methods, a company can greatly improve its business. Over a period of three months at TTM Technologies, I applied this knowledge to fix existing problems and streamline production. In addition, other trouble areas in the production process were discovered, and appropriate lean methods were used to address them. TTM Technologies saw many changes in the right direction over this time period.