878 results for design-based inference


Relevance: 30.00%

Publisher:

Abstract:

Doctoral thesis in Materials Engineering.

Relevance: 30.00%

Publisher:

Abstract:

Doctoral thesis in Environmental and Molecular Biology.

Relevance: 30.00%

Publisher:

Abstract:

Doctoral dissertation for the PhD degree in Chemical and Biological Engineering.

Relevance: 30.00%

Publisher:

Abstract:

Master's dissertation in Applied Biochemistry (specialization in Biotechnology).

Relevance: 30.00%

Publisher:

Abstract:

Identification and characterization of the problem. One of the most important problems associated with software construction is its correctness. In the quest to guarantee that software behaves correctly, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. By their nature, formal methods require considerable experience and expertise, particularly in mathematics and logic, which makes their application costly in practice. As a result, their use has been confined mainly to critical systems, i.e., systems whose malfunction can cause serious harm, even though the benefits of these techniques are relevant to every kind of software. Transferring the benefits of formal methods to software development contexts broader than critical systems would have a strong impact on productivity in those contexts.

Hypothesis. The availability of automated analysis tools is a key element. Examples of this are several powerful analysis tools based on formal methods that target source code directly. For the vast majority of these tools, however, the gap between the notions developers are accustomed to and those required to apply the tools remains too wide. Many tools use assertion languages outside developers' usual knowledge and habits, and in many cases interpreting a tool's output requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts under analysis grow (scalability). This limitation is widely known and is considered critical to the practical applicability of formal analysis methods. One way to attack it is to exploit information and characteristics of specific application domains.

Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models, or code in software development. More precisely, we seek to identify specific settings in which certain automated analysis techniques, such as SMT- or SAT-based analysis, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools usable by developers who are familiar with the application context but not necessarily versed in the underlying methods or techniques.

Materials and methods. The materials will be bibliography relevant to the area and computing equipment. The methods will be those of discrete mathematics, logic, and software engineering.

Expected results. One expected result of the project is the identification of specific application domains for formal analysis methods. We also expect the project to produce analysis tools whose usability is adequate for developers without specific training in the formal methods employed.

Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the goal of increasing software quality and reliability.

A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness, and tackle this problem by building on well-defined notations founded on solid mathematical grounds. Their precise semantics makes formal methods well suited for analysis, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Their acceptance by software engineers is therefore rather limited, and applications of formal methods have been confined to critical systems, even though the advantages they provide clearly apply to any kind of software system. It is widely accepted that appropriate tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are far from simple enough to be employed by software engineers without experience in formal methods. Another important obstacle to the adoption of tool support for formal methods is scalability: automated software analysis is intrinsically complex, and the techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification, or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
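To make the kind of analysis the project refers to concrete, the toy sketch below (purely illustrative, not code from the project) performs explicit-state model checking in miniature: it exhaustively explores the reachable states of a tiny two-process system and checks a mutual-exclusion safety invariant in every state.

```python
# Minimal explicit-state model-checking sketch (illustrative only; none of
# this code is from the project described above). We exhaustively explore
# the reachable states of a tiny transition system and check that a safety
# invariant holds in every state.

def successors(state):
    """Toy system: two processes, each idle (0), trying (1), or in its
    critical section (2). A process may enter the critical section only
    while the other process is not in it."""
    p, q = state
    nxt = []
    for who in (0, 1):
        me, other = (p, q) if who == 0 else (q, p)
        if me == 0:                      # idle -> trying
            new = 1
        elif me == 1 and other != 2:     # trying -> critical, if free
            new = 2
        elif me == 2:                    # critical -> back to idle
            new = 0
        else:
            continue                     # this process is blocked
        nxt.append((new, q) if who == 0 else (p, new))
    return nxt

def check_invariant(initial, invariant):
    """Exhaustively explore all states reachable from `initial`; return a
    counterexample state violating `invariant`, or None if it holds."""
    seen, frontier = {initial}, [initial]
    while frontier:
        state = frontier.pop()
        if not invariant(state):
            return state
        for s in successors(state):
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return None

# Safety property: the two processes are never both in the critical section.
mutex = lambda s: not (s[0] == 2 and s[1] == 2)
print(check_invariant((0, 0), mutex))  # None: the property holds
```

On realistic systems the state space explodes combinatorially, which is exactly the scalability limitation the project proposes to mitigate by exploiting domain-specific structure.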

Relevance: 30.00%

Publisher:

Abstract:

This thesis presents the research and development of sustainable design guidelines for the furniture and wood products industry, suitable for sustainably enhancing design, manufacturing, and associated activities. The guidelines are based on secondary research covering subject areas such as ‘eco’ design, ‘green’ branding, and ‘green’ consumerism, as well as an examination of existing certifications and sustainable tools, techniques, and methodologies, national and international drivers for sustainable development, and an overview of sustainability in the Irish furniture manufacturing context. The guidelines were further developed through primary research, consisting of a focus group attended by leading Irish designers, manufacturers, and academics in the area of furniture and wood products. This group explored the question of ‘green branding’ saturation in the market and whether investing in sustainability is viable yet. Participants stated that they felt the market for ‘green’ products is evolving very slowly and that no metric or legal framework exists to audit whether companies are producing products that really embody sustainability. All the participants believed that developing and introducing a new certification process incorporating a sustainable design process was a viable and necessary way to protect Irish furniture and wood manufacturers going forward. For the purposes of the case study, the author investigated a ‘sustainable’ design process for Team woodcraft, Ltd., through the design and development of a ‘sustainable’ children’s furniture range. The case study followed a typical design and development process: detailing customer design specifications, concept development and refinement, and culminating in a final prototype with associated engineering drawings. Based on this primary and secondary research, the author identified seven fundamental core principles for the sustainable guidelines. The author then expanded these core principles into the basis of proposed new Irish sustainable design guidelines for the furniture and wood products industry, a concept the author has named ‘Green Dot’. The author suggests that the ‘Green Dot’ brand or logo could be used to market an umbrella network of Irish furniture designers and manufacturers who implement the recommended sustainable techniques.

Relevance: 30.00%

Publisher:

Abstract:

The multi-core processor is a design philosophy that has become mainstream in scientific and engineering applications. The increasing performance and gate capacity of recent FPGA devices have permitted complex logic systems to be implemented on a single programmable device. Using VHDL, we present an implementation of a multi-core processor built from the PLASMA IP core, which implements most of the MIPS I ISA, give an overview of the processor architecture, and share execution results.

Relevance: 30.00%

Publisher:

Abstract:

Background: The distally based anterolateral thigh (ALT) flap is an interesting reconstructive solution for complex soft tissue defects of the knee. In spite of low donor site morbidity, a wide covering surface, and a large arc of rotation, it has never gained popularity among reconstructive surgeons. Venous congestion and difficult flap dissection in the presence of a variable anatomy of the vascular pedicle are the possible reasons.
Methods: An anatomical study of 15 cadaver legs was performed to further clarify the blood supply of the distally based ALT. We present our early experience with the use of preoperative angiography and a safe flap design modification that avoids distal intramuscular skeletonization of the vascular pedicle and includes a subcutaneous strip running from the distal end of the flap to the pivot point.
Results: The distally based ALT presents a constant and reliable retrograde vascular contribution from the superior genicular artery. Preoperative angiography reliably identified and avoided critical Shieh Type II pedicled flaps. The preservation of a subcutaneous strip running from the distal flap end to the upper knee was associated with the absence of venous congestion in a short case series.
Conclusions: Preoperative angiography and a flap design modification are proposed to allow the safe transfer of the distally based ALT to reconstruct soft tissue defects of the knee.

Relevance: 30.00%

Publisher:

Abstract:

Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obscuring distinctions in individual performance and brain mechanisms that are better characterised by inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]. So far, single-trial analysis has typically been based on time-frequency analysis of single-electrode data or single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at the average event-related potential (ERP) level.
Methods: Nine healthy subjects participated in the experiment. Auditory meaningful sounds of common objects were used for a target detection task [2]. In each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising the single trials of the two conditions, living and man-made. The analysis comprised two steps. First, a mixture of Gaussians analysis [3] provided representative topographies for each subject. Second, the conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level.
Results: The occurrence of each map is structured in time and consistent across trials at both the single-subject and group levels. By conducting separate analyses of ERPs at the single-subject and group levels, we could quantify the consistency of the identified topographies and their time courses of activation within and across participants as well as experimental conditions. A general agreement was found with previous analyses at the average ERP level.
Conclusions: This novel approach to single-trial analysis promises to have an impact on several domains. In clinical research, it makes it possible to statistically evaluate single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding the interdependencies of behaviour and brain activity at both the single-subject and group levels. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and inter-individual variability.
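The two-step procedure (representative topographies from a mixture of Gaussians, then conditional probabilities per trial) can be sketched as follows. This is purely illustrative toy code, not the authors' implementation (which is presented in their reference [1]): a basic EM fit of a spherical Gaussian mixture, whose posterior responsibilities play the role of the conditional probabilities used for inference across trials.

```python
import numpy as np

# Illustrative sketch only: fit a mixture of Gaussians to single-trial
# "topographies" -- here, toy 2-dimensional feature vectors -- and read the
# posterior responsibilities as conditional probabilities of each
# representative map given a trial.

rng = np.random.default_rng(0)
# Toy data: 100 trials drawn from two underlying "maps".
trials = np.vstack([rng.normal([0.0, 0.0], 0.3, (50, 2)),
                    rng.normal([2.0, 2.0], 0.3, (50, 2))])

def fit_gmm(x, k, iters=50):
    """Basic EM for a spherical-covariance Gaussian mixture."""
    n, d = x.shape
    # Crude deterministic initialization: spread initial means over the data.
    means = x[np.linspace(0, n - 1, k).astype(int)].copy()
    var = np.ones(k)
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of component j for trial i.
        sq = ((x[:, None, :] - means[None]) ** 2).sum(-1)
        log_p = (-0.5 * sq / var - 0.5 * d * np.log(2 * np.pi * var)
                 + np.log(weights))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate means, variances, and mixing weights.
        nk = resp.sum(0)
        means = (resp.T @ x) / nk[:, None]
        sq = ((x[:, None, :] - means[None]) ** 2).sum(-1)
        var = (resp * sq).sum(0) / (nk * d)
        weights = nk / n
    return means, resp

means, resp = fit_gmm(trials, k=2)
# Each trial is assigned the representative map with the highest
# conditional probability.
labels = resp.argmax(axis=1)
```

In the study itself the feature vectors are full electrode topographies rather than 2-D points, and the inference step additionally tests the structure of the responsibilities across time and conditions.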

Relevance: 30.00%

Publisher:

Abstract:

CD8 T cells play a key role in mediating protective immunity against selected pathogens after vaccination. Understanding the mechanism of this protection is dependent upon definition of the heterogeneity and complexity of cellular immune responses generated by different vaccines. Here, we identify previously unrecognized subsets of CD8 T cells based upon analysis of gene-expression patterns within single cells and show that they are differentially induced by different vaccines. Three prime-boost vector combinations encoding HIV Env stimulated antigen-specific CD8 T-cell populations of similar magnitude, phenotype, and functionality. Remarkably, however, analysis of single-cell gene-expression profiles enabled discrimination of a majority of central memory (CM) and effector memory (EM) CD8 T cells elicited by the three vaccines. Subsets of T cells could be defined based on their expression of Eomes, Cxcr3, and Ccr7, or Klrk1, Klrg1, and Ccr5 in CM and EM cells, respectively. Of CM cells elicited by DNA prime-recombinant adenoviral (rAd) boost vectors, 67% were Eomes(-) Ccr7(+) Cxcr3(-), in contrast to only 7% and 2% stimulated by rAd5-rAd5 or rAd-LCMV, respectively. Of EM cells elicited by DNA-rAd, 74% were Klrk1(-) Klrg1(-)Ccr5(-) compared with only 26% and 20% for rAd5-rAd5 or rAd5-LCMV. Definition by single-cell gene profiling of specific CM and EM CD8 T-cell subsets that are differentially induced by different gene-based vaccines will facilitate the design and evaluation of vaccines, as well as enable our understanding of mechanisms of protective immunity.

Relevance: 30.00%

Publisher:

Abstract:

This study focuses on identification and exploitation processes among Finnish design entrepreneurs (i.e. self-employed industrial designers). More specifically, it strives to find out what design entrepreneurs do when they create new ventures, how venture ideas are identified, and how entrepreneurial processes are organized to identify and exploit such venture ideas in the given industrial context. What do educated and creative individuals do when they decide to create new ventures, where do the venture ideas originally come from, and how are venture ideas identified and developed into viable business concepts that are introduced to the markets? From an academic perspective, there is a need to increase our understanding of the interaction between the identification and exploitation of emerging ventures, in this and other empirical contexts. Rather than assuming that venture ideas are constant in time, this study examines how emerging ideas are adjusted to enable exploitation in dynamic market settings. It builds on insights from previous entrepreneurship process research. The interpretations from the theoretical discussion build on the assumption that the subprocesses of identification and exploitation interact and are closely entwined with each other (e.g. McKelvie & Wiklund, 2004; Davidsson, 2005). This explanation challenges the common assumption that entrepreneurs first identify venture ideas and then exploit them (e.g. Shane, 2003); the assumption here is that exploitation influences identification, just as identification influences exploitation. Based on interviews with design entrepreneurs and external actors (e.g. potential customers, suppliers and collaborators), it appears that the identification and exploitation of venture ideas are carried out in close interaction among a number of actors, rather than by entrepreneurs alone.
Because of the resources available to them, design entrepreneurs prefer to focus on identification-related activities and to find external actors who take care of exploitation-related activities. The involvement of external actors may have a direct impact on decision-making and on various activities along the processes of identification and exploitation, which previous research does not particularly emphasize. For instance, Bhave (1994) suggests both operative and strategic feedback from the market, but does not explain how external parties are actually involved in decision-making and in carrying out various activities along the entrepreneurial process.

Relevance: 30.00%

Publisher:

Abstract:

In this report the author, taking a modern approach, redesigns and implements the platform that a 21st-century telecommunications company needs in order to provide telephony and communications services to its users and customers. Throughout the exposition, the reader is led from an initial design phase to the implementation and production deployment of the final system, with a focus on meeting the current needs this entails. The report covers the software, hardware, and business processes involved in meeting this goal, and introduces the reader to the many technologies used to achieve it, emphasizing the current convergence of networks toward the concept of IP networks and, building on this trend, using voice-over-IP technology to shape the platform that is finally, in practice, put into production.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Little information is available on the validity of simple and indirect body-composition methods in non-Western populations. Equations for predicting body composition are population-specific, and body composition differs between blacks and whites. OBJECTIVE: We tested the hypothesis that the validity of equations for predicting total body water (TBW) from bioelectrical impedance analysis measurements is likely to depend on the racial background of the group from which the equations were derived. DESIGN: The hypothesis was tested by comparing, in 36 African women, TBW values measured by deuterium dilution with those predicted by 23 equations developed in white, African American, or African subjects. These cross-validations in our African sample were also compared, whenever possible, with results from other studies in black subjects. RESULTS: Errors in predicting TBW showed acceptable values (1.3-1.9 kg) in all cases, whereas a large range of bias (0.2-6.1 kg) was observed independently of the ethnic origin of the sample from which the equations were derived. Three equations (2 from whites and 1 from blacks) showed nonsignificant bias and could be used in Africans. In all other cases, we observed either an overestimation or underestimation of TBW with variable bias values, regardless of racial background, yielding no clear trend for validity as a function of ethnic origin. CONCLUSIONS: The findings of this cross-validation study emphasize the need for further fundamental research to explore the causes of the poor validity of TBW prediction equations across populations rather than the need to develop new prediction equations for use in Africa.
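The bias and prediction-error figures quoted above are standard cross-validation statistics. As a small sketch with invented numbers (neither the data nor the prediction equation below come from the study), bias is the mean of the predicted-minus-measured differences, and the prediction error can be taken as the standard deviation of those differences:

```python
# Illustrative only: toy measured TBW values (kg, e.g. from deuterium
# dilution) and values predicted by a hypothetical bioimpedance equation.
measured = [28.4, 31.0, 25.7, 29.9, 27.2, 30.5]
predicted = [29.1, 32.2, 26.0, 31.0, 27.9, 31.4]

n = len(measured)
differences = [p - m for p, m in zip(predicted, measured)]

# Bias: mean difference between predicted and measured values.
bias = sum(differences) / n

# Prediction error: sample standard deviation of the differences around
# the bias (one common definition; the study's exact metric may differ).
error = (sum((d - bias) ** 2 for d in differences) / (n - 1)) ** 0.5

print(f"bias = {bias:.2f} kg, prediction error = {error:.2f} kg")
```

A significantly nonzero bias means the equation systematically over- or underestimates TBW in the new population, which is the pattern the study reports for most of the 23 cross-validated equations.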

Relevance: 30.00%

Publisher:

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists can provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful tools for dealing with this complexity, and extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- includes self-contained introductions to probability and decision theory;
- develops the characteristics of Bayesian networks, object-oriented Bayesian networks, and their extension to decision models;
- features implementations of the methodology with reference to commercial and academically available software;
- presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases;
- provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning;
- contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings, and for decisions based on them;
- is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background;
- includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
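To illustrate the kind of computation such networks automate, here is a minimal sketch, with invented numbers and not taken from the book, of Bayesian-network inference by enumeration on a three-node chain network of the sort used to evaluate trace evidence:

```python
from itertools import product

# Minimal sketch of Bayesian-network inference by enumeration, in the
# spirit of -- but not taken from -- the forensic networks the book
# describes. All probabilities below are invented for illustration.
#
# Chain network: H -> T -> E
#   H: the suspect is the source of the trace
#   T: the trace was transferred and persisted
#   E: matching material is recovered

P_H = {True: 0.5, False: 0.5}            # prior on the source proposition
P_T_given_H = {True: 0.6, False: 0.01}   # transfer and persistence
P_E_given_T = {True: 0.95, False: 0.02}  # recovery of matching material

def joint(h, t, e):
    """Joint probability of one full assignment, following the network's
    factorization P(H) * P(T|H) * P(E|T)."""
    p_t = P_T_given_H[h] if t else 1 - P_T_given_H[h]
    p_e = P_E_given_T[t] if e else 1 - P_E_given_T[t]
    return P_H[h] * p_t * p_e

# Posterior P(H=true | E=true), summing out the unobserved variable T.
num = sum(joint(True, t, True) for t in (True, False))
den = sum(joint(h, t, True) for h, t in product((True, False), repeat=2))
posterior = num / den

# Likelihood ratio P(E | Hp) / P(E | Hd), the standard forensic summary.
likelihood_ratio = (num / P_H[True]) / ((den - num) / P_H[False])
print(f"P(H|E) = {posterior:.3f}, LR = {likelihood_ratio:.1f}")
```

Reporting the likelihood ratio rather than the posterior keeps the scientist's evaluation of the findings separate from the prior, which remains the province of the court.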