968 results for Combinatorial Designs
Abstract:
In this project we analysed the determinants of the educational trajectories of adolescents of immigrant origin, focusing on the role their families play in their children's school success or failure. To this end, the study combines quantitative and qualitative techniques. On the one hand, we analysed longitudinal data from the Panel de Famílies i Infància, which allowed us to follow, throughout their adolescence, the educational and personal trajectories of 248 students of immigrant origin who were enrolled in ESO (compulsory secondary education) in 2006, and to identify the social factors responsible for their diversification. The results indicate that, despite holding fairly favourable attitudes towards studying and educational attainment, these students accumulate several situations of vulnerability at home (socioeconomic difficulties, atypical family structures, and erosion of social capital) that negatively affect their academic performance. On the other hand, we conducted 59 semi-structured interviews to complement and aid the interpretation of the quantitative results and to capture the narratives of the protagonists themselves. These interviews were conducted with: a subsample of the families of these students, selected according to profiles of educational success or failure in the minor's trajectory (46); a subsample of resilient students (a); and a series of educational and social agents, including members of school management teams, parents' associations (AMPA) and organisations devoted to the care of children and families (5). The project has a clear public-service vocation. Its objective is to increase knowledge of the "out-of-school" factors that may condition the school success of students of immigrant origin. This knowledge provides the basis for designing and guiding support programmes for the families of children at risk. Our aim (which reflects the main objective of the Institut d'Infància i Món Urbà, the institution promoting the project) is to contribute to the transfer of knowledge that may be useful to those working directly on the issues we study.
Abstract:
Urease is an important virulence factor for Helicobacter pylori and is critical for bacterial colonization of the human gastric mucosa. Specific inhibition of urease activity has been proposed as a possible strategy to fight this bacterium, which infects billions of individuals throughout the world and can lead to severe pathological conditions in a limited number of cases. We have selected peptides which specifically bind and inhibit H. pylori urease from libraries of random peptides displayed on filamentous phage in the context of the pIII coat protein. Screening of a highly diverse 25-mer combinatorial library and two newly constructed random 6-mer peptide libraries on solid-phase H. pylori urease holoenzyme allowed the identification of two peptides, the 24-mer TFLPQPRCSALLRYLSEDGVIVPS and the 6-mer YDFYWW, that can bind and inhibit the activity of urease purified from H. pylori. These two peptides were chemically synthesized and their inhibition constants (Ki) were found to be 47 microM for the 24-mer and 30 microM for the 6-mer peptide. Both peptides specifically inhibited the activity of H. pylori urease but not that of Bacillus pasteurii.
Abstract:
Background: Prognostic and predictive markers are of great importance for future study designs and essential for the interpretation of clinical trials incorporating an EGFR inhibitor. The current study prospectively assessed and validated KRAS, BRAF and PIK3CA mutations in rectal cancer patients screened for the trial SAKK 41/07 of concomitant preoperative radio-chemotherapy with or without panitumumab. Methods: Macrodissection was performed on pretreatment formalin-fixed, paraffin-embedded biopsy tissue sections to arrive at a minimum of 50% tumor cells. DNA was extracted with the Maxwell 16 FFPE Tissue LEV DNA purification kit. After PCR amplification, mutations were identified by pyrosequencing. We prospectively analysed pretreatment biopsy material from 149 rectal cancer patients for KRAS (exon 2 codons 12 [2-12] and 13 [2-13], exon 3 codons 59 [3-59] and 61 [3-61], exon 4 codons 117 [4-117] and 146 [4-146]). Sixty-eight patients (KRAS wild type in exons 2 and 3 only) were further analysed for BRAF (exon 15 codon 600) and PIK3CA (exon 9 codons 542, 545 and 546, exon 20 codons 1043 [20-1043] and 1047 [20-1047]) mutations, and for EGFR copy number by qPCR. For the calculation of the EGFR copy number, we used the KRAS copy number as internal reference standard; the calculation was based on the two-standard-curves relative quantification method. Results: Among the 149 screened patients with rectal cancer, the prevalence of KRAS mutations was 36%. Among the 68 patients enrolled in SAKK 41/07 based on initially presumed KRAS wild-type status (exon 2, codons 12+13), 18 patients (26%) had a total of 23 mutations in the RAS/PIK3CA pathways upon validation analysis. Twelve patients had a KRAS mutation, 7 a PIK3CA mutation, 3 an NRAS mutation, and 1 a BRAF mutation. Surprisingly, five of these patients had double mutations, including 4 with KRAS plus PIK3CA mutations and 1 with NRAS plus PIK3CA mutations. The median normalized EGFR copy number was 1. Neither mutations of KRAS, BRAF or PIK3CA, nor EGFR copy number, were statistically associated with the primary study endpoint pCR (pathological complete regression). Conclusions: The prevalence of KRAS mutations in rectal and in colon cancer appears to be similar. BRAF mutations are rare; PIK3CA mutations are more common (10%). EGFR copy number is not increased in rectal cancer. A considerable number of KRAS exon 2 wild-type tumors harbored KRAS exon 3 or 4 mutations. Further study is needed to determine whether KRAS testing should include exons 2-4.
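The two-standard-curves relative quantification mentioned above can be illustrated with a short sketch: fit one standard curve (Ct versus log quantity) for the target assay (EGFR) and one for the reference assay (KRAS), interpolate the sample quantities from their Ct values, and take the ratio. All dilution series and Ct values below are invented for illustration and are not taken from the study.

```python
import numpy as np

# Hypothetical dilution series (log10 template amount) and measured Ct values
# for the target (EGFR) and the internal reference (KRAS) assays.
log_amount = np.array([4.0, 3.0, 2.0, 1.0, 0.0])          # log10(copies)
ct_egfr_std = np.array([18.1, 21.4, 24.8, 28.2, 31.5])    # illustrative Ct values
ct_kras_std = np.array([17.9, 21.3, 24.6, 28.0, 31.4])

def standard_curve(ct, log_q):
    """Fit Ct = slope * log10(quantity) + intercept and return a function
    mapping a sample Ct back to an interpolated quantity."""
    slope, intercept = np.polyfit(log_q, ct, 1)
    return lambda sample_ct: 10 ** ((sample_ct - intercept) / slope)

egfr_quantity = standard_curve(ct_egfr_std, log_amount)
kras_quantity = standard_curve(ct_kras_std, log_amount)

def normalized_copy_number(ct_egfr, ct_kras, calibrator_ratio=1.0):
    """EGFR quantity divided by KRAS quantity, scaled to a diploid calibrator
    so that a tumor without EGFR amplification gives a value near 1."""
    ratio = egfr_quantity(ct_egfr) / kras_quantity(ct_kras)
    return ratio / calibrator_ratio

# Example: a tumor sample with hypothetical Ct values close to the calibrator.
print(round(normalized_copy_number(ct_egfr=25.1, ct_kras=25.0), 2))
```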
Abstract:
Human genetics has progressed at an unprecedented pace during the past 10 years. DNA microarrays currently allow screening of the entire human genome with a high level of coverage, and we are now entering the era of high-throughput sequencing. These remarkable technical advances are influencing the way medical research is conducted and have boosted our understanding of the structure of the human genome as well as of disease biology. In this context, it is crucial for clinicians to understand the main concepts and limitations of modern genetics. This review will describe key concepts in genetics, including the different types of genetic markers in the human genome, review current methods to detect DNA variation, describe major online public databases in genetics, explain key concepts in statistical genetics and finally present commonly used study designs in clinical and epidemiological research. This review will therefore concentrate on the analysis of human genetic variation.
Abstract:
Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p.37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, or non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful for gaining insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues, but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement to the maximum unemployment benefit duration in order to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly by containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that makes it possible to learn about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative about reservation wage movements over the unemployment spell.
The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies. The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency at the top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches - such as provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
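As a hedged illustration of the difference-in-differences identification used in the first chapter, the minimal sketch below estimates the interaction of treatment group and post-reform period on simulated data; the variables, magnitudes and specification are invented for the example and are not the thesis's administrative data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 4000

# Simulated worker-level data: 'treated' marks the group whose maximum benefit
# duration was shortened, 'post' marks observations after the reform.
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
})
# Hypothetical outcome: medium-run monthly earnings with a +80 treatment effect.
df["earnings"] = (
    2000
    + 100 * df["treated"]                # time-invariant group difference
    + 50 * df["post"]                    # common time trend
    + 80 * df["treated"] * df["post"]    # the causal effect of interest
    + rng.normal(0, 300, n)
)

# The coefficient on treated:post is the difference-in-differences estimate.
model = smf.ols("earnings ~ treated + post + treated:post", data=df).fit(cov_type="HC1")
print(model.params["treated:post"], model.bse["treated:post"])
```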
Abstract:
Devices for venous cannulation have seen significant progress over time: the original rigid steel cannulas have evolved toward flexible plastic cannulas with wire support that prevents kinking, very thin-walled wire-wound cannulas allowing percutaneous application, and all sorts of combinations. In contrast to all these rectilinear venous cannula designs, which present the same cross-sectional area over their entire intravascular path, the smartcanula concept of "collapsed insertion and expansion in situ" is the logical next step for venous access. Automatically adjusting the cross-sectional area up to a pre-determined diameter or to the vessel lumen provides optimal flow and ease of use for both insertion and removal. Smartcanula performance was assessed in a small series of patients (76 +/- 17 kg) undergoing redo procedures. The calculated target pump flow (2.4 L/min/m2) was 4.42 +/- 0.61 L/min. Mean pump flow achieved during cardiopulmonary bypass was 4.84 +/- 0.87 L/min, or 110% of the target. Reduced atrial chatter, kink resistance in situ, and improved blood drainage despite a smaller access orifice are the most striking advantages of this new device. The benefits of smart cannulation are obvious in remote cannulation for limited-access cardiac surgery, but there are many other cannula applications where space is an issue, and that is where smart cannulation is most effective.
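The reported target flow is simply the cardiac index multiplied by body surface area. The sketch below shows that arithmetic under the assumption of the Du Bois BSA formula (the study may have used a different formula); the patient height and weight are illustrative, not from the series.

```python
def du_bois_bsa(height_cm: float, weight_kg: float) -> float:
    """Body surface area in m^2 (Du Bois & Du Bois formula)."""
    return 0.007184 * (height_cm ** 0.725) * (weight_kg ** 0.425)

def target_pump_flow(height_cm: float, weight_kg: float,
                     cardiac_index: float = 2.4) -> float:
    """Target cardiopulmonary bypass flow in L/min = cardiac index x BSA."""
    return cardiac_index * du_bois_bsa(height_cm, weight_kg)

# Illustrative patient (not from the study): 175 cm, 76 kg.
print(round(target_pump_flow(175, 76), 2))  # roughly 4.6 L/min for this example
```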
Abstract:
Study carried out during a research stay at the Laboratoire d'études sur les monothéismes (UMR 8584, Centre national de la recherche scientifique / École pratique des hautes études / Université Paris IV-Sorbonne), France, between 2010 and 2011. It analyses the structural crisis that affected the Gallic church between the last quarter of the fourth century and the first quarter of the sixth, a crisis caused by the large-scale Christianisation of the Gallo-Roman aristocratic elites and by this class's demand that its economic and social pre-eminence be carried over into the sphere of the Church's institutional hierarchy. This process led to the emergence of interpretations of the "Christian existential condition" that sought to legitimise, on the theoretical level, the takeover of Christian communities by the senatorial nobility. In relation to this last point, particular attention has been given to the so-called "semi-Pelagian controversy" in Provence, with special emphasis on two points: a) the relationship between the opposition to the Augustinian theology of grace in certain Provençal monastic circles (Marseille, Lérins) and the emergence in these milieus of an autobiographical literature in which reflection on the concepts of divine uocatio and conuersio to Christian asceticism is closely linked to a theoretical effort to redefine and reorient the aristocratic ethos; and b) the relationship between the theological points debated in this controversy and the ecclesiological conceptions of the thinkers who took part in it (ecclesiology being understood here as the theoretical definition of the limits and foundations of the "Christian community", with particular attention in this case to views on the role the aristocrat was to play in these new "transversal" communities). This two-year project has shown that there was no such thing as a "semi-Pelagian theology", given the antagonistic ecclesiological conceptions of the authors traditionally associated with this current of thought: Cassian understands the Christian community as an ascetic elite in which "lay" criteria of social stratification are suspended, and rejects, in theory and in practice, the idea that this elite should assume the leadership of the community of lay faithful; among the authors of the Lérins circle, by contrast, opposition to the Augustinian theology of grace is inspired by the effort to extend monastic ideals to the whole Christian community, something that also served as a way of legitimising the authority of the monk-bishops of aristocratic origin who emerged from the monastery of Lérins.
Abstract:
Critical real-time embedded (CRTE) systems require safe and tight worst-case execution time (WCET) estimations to provide the required safety levels and keep costs low. However, CRTE systems need increasing performance to satisfy the demands of existing and new features. Such performance can only be achieved by means of more aggressive hardware architectures, which are much harder to analyze from a WCET perspective; the main features considered include cache memories and multi-core processors. Thus, although such features provide higher performance, current WCET analysis methods are unable to provide tight WCET estimations. In fact, WCET estimations become worse than for simpler and less powerful hardware. The main reason is that hardware behavior is deterministic but unknown, and therefore the worst-case behavior must be assumed most of the time, leading to large WCET estimations. The purpose of this project is to develop new hardware designs, together with WCET analysis tools, able to provide tight and safe WCET estimations. To do so, those pieces of hardware whose behavior is not easily analyzable, due to lack of accurate information during WCET analysis, will be enhanced to produce a probabilistically analyzable behavior. Thus, even if the worst-case behavior cannot be removed, its probability can be bounded, and hence a safe and tight WCET can be provided for a particular safety level, in line with the safety levels of the remaining components of the system. During the first year of the project we developed most of the evaluation infrastructure as well as the hardware techniques to analyze cache memories. During the second year those techniques were evaluated, and new purely-software techniques were developed.
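The idea of bounding the probability of worst-case behavior can be illustrated with a toy Monte Carlo sketch: under a deliberately crude model of a randomized cache, one can attach an empirical exceedance probability to an execution-time bound. This is only an illustration of the concept; real measurement-based probabilistic timing analysis extrapolates the distribution tail (for example with extreme value theory) rather than reading empirical quantiles, and the numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: a task makes N memory accesses; with a random-replacement cache
# each access is assumed to hit with probability P_HIT (a crude simplification).
N, P_HIT = 10_000, 0.9
HIT_CYCLES, MISS_CYCLES = 1, 100

def simulate_execution_time() -> int:
    hits = rng.binomial(N, P_HIT)
    return hits * HIT_CYCLES + (N - hits) * MISS_CYCLES

# Measure many runs and look at empirical exceedance probabilities.
times = np.array([simulate_execution_time() for _ in range(100_000)])

def pwcet_bound(exceedance_prob: float) -> float:
    """Smallest observed bound whose empirical probability of being
    exceeded is at most exceedance_prob (an empirical quantile)."""
    return float(np.quantile(times, 1.0 - exceedance_prob))

print(pwcet_bound(1e-3))   # bound exceeded in roughly 1 of 1000 runs
print(times.max())         # highest execution time actually observed
```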
Abstract:
In recent years, telemetry systems for medical applications have grown significantly in diagnosis and in the monitoring of, for example, glucose, blood pressure, temperature and heart rate. Implanted devices broaden the range of medical applications and bring an improvement in quality of life for the user. For this reason, this project studies two of the most common antennas, the dipole and the patch, the latter being especially used in implanted applications. In the analysis of these antennas, characteristics related to the application environment, as well as to the antenna itself, have been parameterised, explaining the behaviour that, in contrast to free space, the antennas exhibit when these parameters change. At the same time, a setup for measuring implanted antennas, based on a single-layer model of the human body, has been implemented. Compared with the results of simulations carried out with the FEKO software, good agreement was obtained with the empirical measurements of matching and gain of the microstrip antennas. Thanks to the parametric analysis, this project also presents several antenna designs that optimise the realizable gain, with the aim of achieving the best possible communication with the external device or base station.
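For orientation, the sketch below applies the standard free-space transmission-line design equations for a rectangular microstrip patch. It is only a starting point under free-space assumptions: as the abstract stresses, an implanted antenna is surrounded by high-permittivity tissue, so such closed forms would at most seed the parametric and full-wave analysis (e.g., in FEKO). The substrate values are illustrative.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def patch_dimensions(f_hz: float, eps_r: float, h_m: float):
    """Approximate width and length of a rectangular microstrip patch
    (standard transmission-line design equations, free-space environment)."""
    w = C / (2 * f_hz) * math.sqrt(2 / (eps_r + 1))
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h_m / w) ** -0.5
    # Fringing-field length extension.
    dl = 0.412 * h_m * ((eps_eff + 0.3) * (w / h_m + 0.264)) / (
        (eps_eff - 0.258) * (w / h_m + 0.8))
    l = C / (2 * f_hz * math.sqrt(eps_eff)) - 2 * dl
    return w, l

# Illustrative example: 2.45 GHz ISM-band patch on a 1.6 mm, eps_r = 4.4 substrate.
w, l = patch_dimensions(2.45e9, eps_r=4.4, h_m=1.6e-3)
print(f"W = {w*1e3:.1f} mm, L = {l*1e3:.1f} mm")
```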
Abstract:
Interfacial micromotion is closely associated with the long-term success of cementless hip prostheses. Various techniques have been proposed to measure it, but only a small number of points on the stem surface can be measured simultaneously. In this paper, we propose a new technique based on micro-computed tomography (μCT) to measure locally the relative interfacial micromotions between the metallic stem and the surrounding femoral bone. Tantalum beads were stuck to the stem surface and spread over the endosteal surface. Relative micromotions between the stem and the endosteal bone surfaces were measured at different loading amplitudes. The estimated error was 10 μm and the maximal micromotion was 60 μm, in the loading direction, at 1400 N. This pilot study provided a local measurement of the micromotions in 3 directions and at 8 locations on the stem surface simultaneously. The technique could easily be extended to higher loads and a much larger number of points, covering the entire stem surface and providing a quasi-continuous distribution of the 3D interfacial micromotions around the stem. The new measurement method would be very useful for comparing the micromotions induced by different stem designs and for optimizing the primary stability of cementless total hip arthroplasty.
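The abstract does not detail how bead coordinates are turned into micromotion values. One plausible reconstruction, shown below purely as an illustration and not necessarily the authors' procedure, is to rigidly register the bone-bead set between the unloaded and loaded scans (to remove whole-specimen motion) and take the residual displacement of the stem beads as the interfacial micromotion. All coordinates are hypothetical.

```python
import numpy as np

def rigid_fit(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (Kabsch): returns R, t with dst ~ src @ R.T + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - c_src @ R.T
    return R, t

def micromotion(bone_ref, bone_load, stem_ref, stem_load):
    """Register the bone beads between scans, express the loaded stem beads in the
    bone frame, and return per-bead displacement magnitudes (same units as input)."""
    R, t = rigid_fit(bone_load, bone_ref)           # loaded -> reference bone frame
    stem_load_in_ref = stem_load @ R.T + t
    return np.linalg.norm(stem_load_in_ref - stem_ref, axis=1)

# Hypothetical bead centroids in micrometers (8 stem beads, 8 bone beads).
rng = np.random.default_rng(2)
bone_ref = rng.uniform(0, 40_000, (8, 3))
stem_ref = rng.uniform(0, 40_000, (8, 3))
true_shift = np.array([30.0, -20.0, 50.0])          # imposed stem micromotion
bone_load = bone_ref + np.array([5.0, 5.0, 5.0])    # whole-specimen motion only
stem_load = stem_ref + np.array([5.0, 5.0, 5.0]) + true_shift

print(micromotion(bone_ref, bone_load, stem_ref, stem_load).round(1))
```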
Abstract:
Most leadership and management researchers ignore one key design and estimation problem rendering parameter estimates uninterpretable: Endogeneity. We discuss the problem of endogeneity in depth and explain conditions that engender it using examples grounded in the leadership literature. We show how consistent causal estimates can be derived from the randomized experiment, where endogeneity is eliminated by experimental design. We then review the reasons why estimates may become biased (i.e., inconsistent) in non-experimental designs and present a number of useful remedies for examining causal relations with non-experimental data. We write in intuitive terms using nontechnical language to make this chapter accessible to a large audience.
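Among the standard remedies for endogeneity with non-experimental data is instrumental-variable estimation. The minimal sketch below hand-rolls two-stage least squares on simulated data to show how the bias of naive OLS is removed when a valid instrument is available; the variables and magnitudes are invented, and the chapter itself should be consulted for the full set of remedies.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000

# Simulated endogeneity: the regressor x (e.g., a leadership practice) is
# correlated with the error term u, so OLS is inconsistent.
u = rng.normal(size=n)
z = rng.normal(size=n)                 # instrument: relevant, excluded from y
x = 0.8 * z + 0.6 * u + rng.normal(size=n)
y = 1.0 + 2.0 * x + u                  # true causal effect of x on y is 2.0

# Naive OLS (biased upward because cov(x, u) > 0).
ols = sm.OLS(y, sm.add_constant(x)).fit()

# 2SLS by hand: stage 1 predicts x from the instrument,
# stage 2 regresses y on the predicted x.
stage1 = sm.OLS(x, sm.add_constant(z)).fit()
stage2 = sm.OLS(y, sm.add_constant(stage1.fittedvalues)).fit()

# Note: stage-2 standard errors computed this way are not valid;
# a dedicated IV estimator would be used for inference in practice.
print("OLS estimate: ", round(ols.params[1], 3))
print("2SLS estimate:", round(stage2.params[1], 3))
```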
Abstract:
Federal and state policy makers increasingly emphasize the need to reduce highway crash rates. This emphasis is demonstrated in Iowa's recently released draft Iowa Strategic Highway Safety Plan and by the U.S. Department of Transportation's placement of "improved transportation safety" at the top of its list of strategic goals. Thus, finding improved methods to enhance highway safety has become a top priority at highway agencies. The objective of this project is to develop tools and procedures by which Iowa engineers can identify potentially hazardous roadway locations and designs, and to demonstrate the utility of these tools by developing candidate lists of high-crash locations in the state. An initial task, building an integrated database to support these tools and procedures, is an important product in and of itself. Accordingly, the Iowa Department of Transportation (Iowa DOT) Geographic Information Management System (GIMS) and Geographic Information System Accident Analysis and Location System (GIS-ALAS) databases were integrated with available digital imagery. (The GIMS database contains roadway characteristics, e.g., lane width, surface and shoulder type, and traffic volume, for all public roadways. GIS-ALAS records include data, e.g., vehicles, drivers, roadway conditions, and crash severity, for crashes occurring on public roadways during the past 10 years.)
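A minimal sketch of the kind of integration and ranking described, assuming hypothetical column names rather than the actual GIMS and GIS-ALAS schemas: join crash records to roadway segments and rank segments by crashes per million vehicle-miles traveled.

```python
import pandas as pd

# Hypothetical extracts; the real GIMS and GIS-ALAS schemas differ.
gims = pd.DataFrame({
    "segment_id": [101, 102, 103],
    "length_miles": [0.5, 1.2, 0.8],
    "aadt": [12000, 3500, 8200],       # annual average daily traffic
})
crashes = pd.DataFrame({
    "segment_id": [101, 101, 102, 103, 103, 103],
    "severity": ["minor", "major", "minor", "major", "minor", "fatal"],
})

YEARS = 10  # crash-history window, as in the abstract

# Crashes per segment, joined to roadway characteristics.
counts = crashes.groupby("segment_id").size().rename("crash_count").reset_index()
seg = gims.merge(counts, on="segment_id", how="left").fillna({"crash_count": 0})

# Exposure: million vehicle-miles traveled over the analysis period.
seg["mvmt"] = seg["aadt"] * 365 * YEARS * seg["length_miles"] / 1e6
seg["crash_rate"] = seg["crash_count"] / seg["mvmt"]

# Candidate list of high-crash locations, worst first.
print(seg.sort_values("crash_rate", ascending=False)[["segment_id", "crash_rate"]])
```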
Abstract:
This paper introduces Collage, a high-level IMS-LD compliant authoring tool that is specialized for CSCL (Computer-Supported Collaborative Learning). Nowadays, CSCL is a key trend in e-learning, since it highlights the importance of social interactions as an essential element of learning. CSCL is an interdisciplinary domain, which demands participatory design techniques that allow teachers to get directly involved in design activities. Developing CSCL designs using LD is a difficult task for teachers, since LD is a complex technical specification and modelling collaborative characteristics can be tricky. Collage helps teachers in the process of creating their own potentially effective collaborative Learning Designs by reusing and customizing patterns, according to the requirements of a particular learning situation. These patterns, called Collaborative Learning Flow Patterns (CLFPs), represent best practices that are repeatedly used by practitioners when structuring the flow of (collaborative) learning activities. An example of an LD that can be created using Collage is illustrated in the paper. Preliminary evaluation results show that teachers with experience in collaborative learning but without LD knowledge can successfully design real collaborative learning experiences using Collage.
Abstract:
CSCL applications are complex distributed systems that pose special requirements towards achieving success in educational settings. Flexible and efficient design of collaborative activities by educators is a key precondition in order to provide CSCL tailorable systems, capable of adapting to the needs of each particular learning environment. Furthermore, some parts of those CSCL systems should be reused as often as possible in order to reduce development costs. In addition, it may be necessary to employ special hardware devices, computational resources that reside in other organizations, or even exceed the possibilities of one specific organization. Therefore, the proposal of this paper is twofold: collecting collaborative learning designs (scripting) provided by educators, based on well-known best practices (collaborative learning flow patterns) in a standard way (IMS-LD), in order to guide the tailoring of CSCL systems by selecting and integrating reusable CSCL software units; and implementing those units in the form of grid services offered by third-party providers. More specifically, this paper outlines a grid-based CSCL system having these features and illustrates its potential scope and applicability by means of a sample collaborative learning scenario.
Abstract:
Collage is a pattern-based visual design authoring tool for the creation of collaborative learning scripts computationally modelled with IMS Learning Design (LD). The pattern-based visual approach aims to provide teachers with design ideas that are based on broadly accepted practices. It also seeks to hide the LD notation so that teachers can easily create their own designs. The use of visual representations supports both the understanding of the design ideas and the usability of the authoring tool. This paper presents a multi-case study comprising three different cases that evaluate the approach from different perspectives. The first case includes workshops where teachers use Collage. A second case involves the design of a scenario proposed by a third party using related approaches. The third case analyzes a situation where students follow a design created with Collage. The cross-case analysis provides a global understanding of the possibilities and limitations of the pattern-based visual design approach.