810 results for Design Based Research


Relevance:

90.00%

Publisher:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

90.00%

Publisher:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

90.00%

Publisher:

Abstract:

There is an urgent need to expand the number of brain banks serving psychiatric research. We describe here the Psychiatric Disorders arm of the Brain Bank of the Brazilian Aging Brain Study Group (Psy-BBBABSG), which focuses on bipolar disorder (BD) and obsessive-compulsive disorder (OCD). Our protocol was designed to minimize limitations faced by previous initiatives and to enable design-based neurostereological analyses. The Psy-BBBABSG's first milestone is the collection of 10 brains each from BD patients, OCD patients, and matched controls. The brains are sourced from a population-based autopsy service. The clinical and psychiatric assessments were carried out through an informant by an expert team that included psychiatrists. One hemisphere was perfusion-fixed to provide optimal fixation for neurostereological studies. The other hemisphere was comprehensively dissected and frozen for molecular studies. In 20 months, we collected 36 brains. A final report was completed for 14 cases: 3 BD, 4 major depressive disorder, 1 substance use disorder, 1 mood disorder NOS, 3 obsessive-compulsive spectrum symptoms, 1 OCD and 1 schizophrenia. The majority were male (64%), and the average age at death was 67.2 ± 9.0 years. The average postmortem interval was 16 h. Three matched controls were collected. The pilot stage confirmed that the protocols are well suited to our goals. Our unique autopsy source makes it possible to collect a fair number of high-quality cases in a short time. Such a collection offers the international research community an additional resource for advancing the understanding of neuropsychiatric diseases.

Relevance:

90.00%

Publisher:

Abstract:

Whilst a fall in neuron numbers seems to be a common pattern during postnatal development, several authors have nonetheless reported an increase in neuron number, which may be associated with any one of a number of possible processes encompassing either neurogenesis or late maturation and incomplete differentiation. Recent publications have thus added further fuel to the notion that postnatal neurogenesis may indeed exist in sympathetic ganglia. In the light of these uncertainties surrounding the effects exerted by postnatal development on the number of superior cervical ganglion (SCG) neurons, we used state-of-the-art design-based stereology to investigate the quantitative structure of the SCG at four distinct timepoints after birth, viz. 1-3 days, 1 month, 12 months and 36 months. The main effects exerted by ageing on SCG structure were: (i) a 77% increase in ganglion volume; (ii) stability in the total number of SCG nerve cells (neither an increase nor a decrease) during postnatal development; (iii) a higher proportion of uninucleate to binucleate neurons only in newborn animals; (iv) a 130% increase in the volume of uninucleate cell bodies; and (v) the presence of BrdU-positive neurons in animals at all ages. At the time of writing, our results support the idea that neurogenesis takes place in the SCG of preás, although this warrants confirmation by further markers. We also hypothesise that a portfolio of other mechanisms (cell repair, maturation, differentiation and death) may be equally intertwined and implicated in the numerical stability of SCG neurons during postnatal development. (C) 2011 ISDN. Published by Elsevier Ltd. All rights reserved.
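
Design-based stereological counts of this kind typically rest on the optical fractionator estimator, in which particles counted in systematically sampled disectors are scaled up by the inverse sampling fractions. A minimal sketch in Python; the counts and sampling fractions below are hypothetical illustrations, not the study's data.

```python
# Optical fractionator estimate of total neuron number in a ganglion.
# The estimator N = Q * (1/ssf) * (1/asf) * (1/tsf) is the standard
# design-based form; all numbers below are hypothetical.

def optical_fractionator(q_minus, ssf, asf, tsf):
    """Total particle number from disector counts and sampling fractions.

    q_minus : particles counted in the disectors
    ssf     : section sampling fraction (sections sampled / total sections)
    asf     : area sampling fraction (frame area / x-y step area)
    tsf     : thickness sampling fraction (disector height / section thickness)
    """
    return q_minus * (1.0 / ssf) * (1.0 / asf) * (1.0 / tsf)

# Hypothetical scheme: every 10th section, counting frames covering 5% of
# each step area, and a 10 um disector within 25 um thick sections.
n_est = optical_fractionator(q_minus=150, ssf=1 / 10, asf=0.05, tsf=10 / 25)
print(round(n_est))  # 75000
```

Because the sampling is systematic and uniformly random at every level, the estimate is unbiased regardless of neuron size, shape or orientation, which is what makes the design-based approach suited to detecting (or ruling out) changes in total neuron number.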

Relevance:

90.00%

Publisher:

Abstract:

In the collective imagination, a robot is a human-like machine, like the androids of science fiction. However, the robots you will encounter most frequently are machines that do work that is too dangerous, boring or onerous for people. Most of the robots in the world are of this type. They can be found in the automotive, medical, manufacturing and space industries. A robot, then, is a system that contains sensors, control systems, manipulators, power supplies and software all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object: the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, inferring a grasp for an object from information about already known objects. But humans can select the best grasp from a vast repertoire, considering not only the physical attributes of the object to be grasped but also the effect they wish to obtain. This is why, in our case, the study in the area of robot manipulation focuses on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages: it takes into account the uncertainty of the real world, allowing sensor noise to be handled; it encodes notions of causality; and it provides a unified framework for learning.
Since the current network is hand-built from human expert knowledge, it is worthwhile to implement an automated method for learning its structure: as more tasks and object features are introduced in the future, a complex network designed only from expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modeled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert in order to improve it where possible.
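
One step of score-based structure learning of the kind described above can be sketched as follows. The variables ("size", "grasp"), the data and both candidate structures are hypothetical toy stand-ins, not the thesis's actual sensor features or symbolic tasks; only the idea of comparing candidate DAGs by a penalized-likelihood score (here BIC) is the point.

```python
import math
from collections import Counter

def bic_score(data, structure, domains):
    """BIC of a discrete DAG: max log-likelihood minus complexity penalty.

    data      : list of dicts mapping variable name -> observed value
    structure : dict mapping variable name -> tuple of parent names
    domains   : dict mapping variable name -> list of possible values
    """
    n = len(data)
    score = 0.0
    for var, parents in structure.items():
        # Counts of (parent configuration, value) and of parent configurations.
        joint = Counter((tuple(r[p] for p in parents), r[var]) for r in data)
        parent_counts = Counter(tuple(r[p] for p in parents) for r in data)
        # Log-likelihood term with maximum-likelihood CPT estimates.
        for (pa, _value), c in joint.items():
            score += c * math.log(c / parent_counts[pa])
        # BIC penalty: number of free parameters in this node's CPT.
        n_parent_cfg = 1
        for p in parents:
            n_parent_cfg *= len(domains[p])
        free_params = n_parent_cfg * (len(domains[var]) - 1)
        score -= 0.5 * free_params * math.log(n)
    return score

# Toy data in which object size clearly influences the chosen grasp.
data = ([{"size": "small", "grasp": "pinch"}] * 40
        + [{"size": "small", "grasp": "power"}] * 10
        + [{"size": "large", "grasp": "power"}] * 45
        + [{"size": "large", "grasp": "pinch"}] * 5)
domains = {"size": ["small", "large"], "grasp": ["pinch", "power"]}

no_edge = {"size": (), "grasp": ()}               # independent variables
size_to_grasp = {"size": (), "grasp": ("size",)}  # edge size -> grasp

# On this data, the structure with the edge scores higher.
print(bic_score(data, size_to_grasp, domains) > bic_score(data, no_edge, domains))
```

A search procedure (greedy hill climbing, for instance) would repeatedly apply this scoring step to edge additions, deletions and reversals; the learned structure can then be compared against the expert-designed network.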

Relevance:

90.00%

Publisher:

Abstract:

This research is situated at the intersection of educational science, computer science and school practice, and thus has a strongly interdisciplinary character. From the perspective of educational science, it is a research project in the fields of e-learning and multimedia learning, asking which information systems are suited to creating and exchanging digital, multimedia, interactive learning modules. To this end, the methodological and didactic advantages of digital learning content over classical media such as books and paper were first compiled, and potential benefits of new Web 2.0 technologies were identified. Building on this, existing authoring tools for creating digital learning modules and existing exchange platforms were analysed to determine the extent to which they already support and use Web 2.0 technologies. From the computer science perspective, the analysis of existing systems yielded a requirements profile for a new authoring tool and a new exchange platform for digital learning modules. Following the Design Science Research approach, the new system was realised in an iterative development process as the web application LearningApps.org and continuously evaluated with teachers from school practice. Current web technologies were applied during development. The result of the research is a production information system already used by thousands of users in several countries, both in schools and in industry. An empirical study confirmed that the system achieves the goal pursued in its development: simplifying the creation and exchange of digital learning modules. From the perspective of school practice, LearningApps.org contributes to methodological diversity and to the use of ICT in the classroom. The tool's orientation towards mobile devices and 1:1 computing corresponds to the general trend in education.
By linking the tool with current software developments for producing digital textbooks, educational publishers are also addressed as a target group.

Relevance:

90.00%

Publisher:

Abstract:

The research project aims to develop an innovative decision-support methodology, based on performance indicators, for selecting among design alternatives. In particular, the work focused on defining indicators to support decisions in debottlenecking interventions on a process plant. Two "bottleneck indicators" were developed that make it possible to assess the real need for debottlenecking, identifying the causes that limit production and equipment utilisation. They were validated by applying them to the analysis of an intervention on an existing plant and verifying that equipment utilisation was correctly identified. Once the need for the debottlenecking intervention was established, the problem of selecting among the possible process alternatives for realising it was addressed. A method based on sustainability indicators was applied to the choice, which makes it possible to compare alternatives by considering not only the economic return on investment but also the impacts on the environment and safety, and which was further developed in this thesis. Two "area hazard indicators", relating to fugitive emissions, were defined to integrate these aspects into the sustainability analysis of the alternatives. To improve the accuracy of impact quantification, a new predictive model was developed for estimating a plant's fugitive emissions, based solely on the data available at the design stage, which takes into account the types of emitting sources, their leak mechanisms and their maintenance. Validated by comparison with experimental data from a production plant, the method proved indispensable for a correct comparison of the alternatives, since existing models grossly overestimate actual emissions.
Finally, applying the indicators to an existing plant showed that they are fundamental for simplifying the decision-making process, providing clear and precise indications while requiring only a limited amount of information.

Relevance:

90.00%

Publisher:

Abstract:

Community research fatigue has been understudied within the context of community-university relationships and knowledge production. Community-based research (CBR), often occurring within a limited geography and population, increases the possibility that community members feel exhausted or overwhelmed by university research, particularly when they do not see tangible results from research activities. Prompted by informal stories of research fatigue from community members, a small graduate student team sought to understand the extent to which community members experienced research fatigue, and what factors contributed to or relieved feelings of research fatigue. To explore these dimensions of research fatigue, semi-structured, face-to-face interviews were conducted with 21 participants, including community members (n = 9), staff and faculty (n = 10), and students (n = 2). The objective of the research was to identify university practices that contribute to research fatigue and to determine how the issue can be addressed at the university level. Qualitative data analysis revealed several important actionable findings concerning the structure and conduct of community-based research, structured reciprocity and impact, and the role of trust in research. The study's findings are used to assess the quality of Clark University's research relationship with its adjacent community. Recommendations are offered, such as improving partnerships, increasing the impact of CBR, and developing clear principles of practice.

Relevance:

90.00%

Publisher:

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider the "Cox Model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible on the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with the computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935).
We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
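
The partition of gamma into a quantity of interest theta and a nuisance parameter eta, and the aim of inference that depends as little as possible on eta, can be made concrete with a textbook example that is not from the paper itself: profile likelihood for a normal mean, where the variance plays the role of the nuisance parameter.

```python
import math

# Model: y_i ~ Normal(theta, eta), with the mean theta of interest and
# the variance eta a nuisance parameter. For each fixed theta, eta is
# eliminated by maximizing it out analytically, leaving a function of
# theta alone. The data below are made up for illustration.

def profile_loglik(theta, y):
    """Profile log-likelihood for a normal mean.

    For fixed theta the MLE of the nuisance variance is
    eta_hat(theta) = mean((y_i - theta)^2), which is substituted back in.
    """
    n = len(y)
    eta_hat = sum((yi - theta) ** 2 for yi in y) / n
    return -0.5 * n * (math.log(2 * math.pi * eta_hat) + 1)

y = [4.1, 3.8, 5.0, 4.6, 4.4, 3.9]
ybar = sum(y) / len(y)

# The profile likelihood is maximized exactly at the sample mean.
grid = [ybar + d for d in (-0.5, -0.1, 0.0, 0.1, 0.5)]
best = max(grid, key=lambda t: profile_loglik(t, y))
print(best == ybar)  # True: eta has been eliminated from the comparison
```

The same logic, with the partial likelihood in place of the profile likelihood, is what lets Cox's proportional hazards model estimate regression effects while treating the baseline hazard as an infinite-dimensional nuisance parameter.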

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Surfactant protein D (SP-D) deficient mice develop emphysema-like pathology associated with focal accumulations of foamy alveolar macrophages, an excess of surfactant phospholipids in the alveolar space, and both hypertrophy and hyperplasia of alveolar type II cells. These findings are associated with a chronic inflammatory state. Treatment of SP-D deficient mice with a truncated recombinant fragment of human SP-D (rfhSP-D) has been shown to decrease the lipidosis and alveolar macrophage accumulation as well as the production of proinflammatory chemokines. The aim of this study was to investigate whether rfhSP-D treatment reduces the structural abnormalities in parenchymal architecture and type II cells characteristic of SP-D deficiency. METHODS: SP-D knockout mice, aged 3 weeks, 6 weeks and 9 weeks, were treated with rfhSP-D for 9, 6 and 3 weeks, respectively. All mice were sacrificed at age 12 weeks and compared to both PBS-treated SP-D deficient and wild-type groups. Lung structure was quantified by design-based stereology at the light and electron microscopic level. Emphasis was put on quantification of emphysema, type II cell changes and intracellular surfactant. Data were analysed with the two-sided non-parametric Mann-Whitney U-test. MAIN RESULTS: After 3 weeks of treatment, alveolar number was higher and mean alveolar size was smaller compared to saline-treated SP-D knockout controls. There was no significant difference concerning these indices of pulmonary emphysema among the rfhSP-D-treated groups. Type II cell number and size were smaller as a consequence of treatment. The total volume of lamellar bodies per type II cell and per lung was smaller after 6 weeks of treatment. CONCLUSION: Treatment of SP-D deficient mice with rfhSP-D leads to a reduction in the degree of emphysema and a correction of type II cell hyperplasia and hypertrophy.
This supports the concept that rfhSP-D might become a therapeutic option in diseases that are characterized by decreased SP-D levels in the lung.
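
The stereological group comparisons above use the non-parametric Mann-Whitney U-test. As a minimal sketch of the U statistic itself (rank-sum form, with midranks for ties); the measurements below are made up for illustration and are not the study's data.

```python
def mann_whitney_u(x, y):
    """U statistic for sample x versus sample y via the rank-sum formula."""
    combined = sorted(x + y)

    def midrank(v):
        # Average rank over all occurrences of v (handles ties).
        first = combined.index(v) + 1
        last = len(combined) - combined[::-1].index(v)
        return (first + last) / 2

    rank_sum_x = sum(midrank(v) for v in x)
    n_x = len(x)
    return rank_sum_x - n_x * (n_x + 1) / 2  # U = R_x - n_x(n_x+1)/2

# Hypothetical per-animal measurements in treated vs. control groups.
treated = [8.1, 7.9, 8.4, 8.0]
control = [6.9, 7.2, 7.0, 7.5]
print(mann_whitney_u(treated, control))  # 16.0: complete separation
```

Because the test uses only ranks, it makes no normality assumption, which suits the small group sizes typical of stereological animal studies; in practice the statistic would be referred to an exact or normal-approximation null distribution for a p-value.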

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Repeated bronchoalveolar lavage (BAL) has been used in animals to induce surfactant depletion and to study therapeutic interventions for the resulting respiratory insufficiency. Intratracheal administration of surface-active agents such as perfluorocarbons (PFC) can prevent alveolar collapse in surfactant-depleted lungs. However, it is not known how BAL or subsequent PFC administration affect the intracellular and intraalveolar surfactant pools. METHODS: Male Wistar rats were surfactant-depleted by BAL and treated for 1 hour by conventional mechanical ventilation (Lavaged-Gas, n = 5) or partial liquid ventilation with PF5080 (Lavaged-PF5080, n = 5). For control, 10 healthy animals with gas-filled (Healthy-Gas, n = 5) or PF5080-filled lungs (Healthy-PF5080, n = 5) were studied. A design-based stereological approach was used for quantification of lung parenchyma and of the intracellular and intraalveolar surfactant pools at the light and electron microscopic level. RESULTS: Compared to Healthy animals, Lavaged animals had more type II cells with lamellar bodies in the process of secretion and freshly secreted lamellar body-like surfactant forms in the alveoli. The fraction of alveolar epithelial surface area covered with surfactant and the total intraalveolar surfactant content were significantly smaller in Lavaged animals. Compared with gas-filled lungs, both PF5080 groups had a significantly higher total lung volume, but no other differences. CONCLUSION: After BAL-induced alveolar surfactant depletion, the amount of intracellularly stored surfactant is about half as high as in healthy animals. In lavaged animals, short-term liquid ventilation with PF5080 did not alter intra- or extracellular surfactant content or subtype composition.

Relevance:

90.00%

Publisher:

Abstract:

In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experiences that students encounter are unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students gain experience testing the user experience of their product prototypes using methods ranging from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparations must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for hands-on student experience when introducing testing concepts. We learned that experiential learning, when appropriately implemented, can benefit the Computer Science classroom. Examined together, these course-based research projects provided insight into building strong testing practices into a curriculum.
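
The JUnit Generation (JUG) tool described above automated unit-test feedback for Java courses; as a language-neutral sketch of the same idea, here is the pattern with Python's unittest: a student-submitted function plus instructor-written tests whose run produces an automatic pass/fail report. The `median` function and the tests are hypothetical course material, not the tool's actual output.

```python
import unittest

def median(values):
    """Student-submitted function under test."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

class TestMedian(unittest.TestCase):
    # Instructor-written tests: each run yields an automatic report that
    # gives the student another feedback iteration in the ELM cycle.
    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestMedian)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

Automating this report generation is what shortens the feedback loop: each submission becomes a concrete experience followed by immediate reflection, which is the iteration the ELM calls for.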

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: In clinical practice a diagnosis is based on a combination of clinical history, physical examination and additional diagnostic tests. At present, studies in diagnostic research often report the accuracy of tests without taking into account the information already known from history and examination. Due to this lack of information, together with variations in the design and quality of studies, conventional meta-analyses based on these studies will not show the accuracy of the tests in real practice. By using individual patient data (IPD) to perform meta-analyses, the accuracy of tests can be assessed in relation to other patient characteristics, and diagnostic algorithms can be developed or evaluated for individual patients. In this study we will examine these potential benefits in four clinical diagnostic problems in the field of gynaecology, obstetrics and reproductive medicine. METHODS/DESIGN: Based on earlier systematic reviews for each of the four clinical problems, studies are considered for inclusion. The first authors of the included studies will be invited to participate and share their original data. After assessment of validity and completeness, the acquired datasets are merged. Based on these data, a series of analyses will be performed, including a systematic comparison of the results of the IPD meta-analysis with those of a conventional meta-analysis, development of multivariable models for clinical history alone and for the combination of history, physical examination and relevant diagnostic tests, and development of clinical prediction rules for individual patients. These will be made accessible to clinicians. DISCUSSION: The use of IPD meta-analysis will allow the accuracy of diagnostic tests to be evaluated in relation to other relevant information. Ultimately, this could increase the efficiency of the diagnostic work-up, e.g. by reducing the need for invasive tests and/or improving the accuracy of the work-up.
This study will assess whether these benefits of IPD meta-analysis over conventional meta-analysis can be exploited and will provide a framework for future IPD meta-analyses in diagnostic and prognostic research.
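
The conventional meta-analysis that serves as the comparator above pools study-level estimates; the simplest version is a fixed-effect inverse-variance pool. A generic sketch with made-up study estimates (for instance, log diagnostic odds ratios), not data from the review described here.

```python
import math

def inverse_variance_pool(estimates, std_errors):
    """Fixed-effect pooled estimate and its standard error.

    Each study is weighted by the inverse of its variance, so precise
    studies dominate the pooled value.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies: log odds ratios with standard errors.
est = [0.8, 1.1, 0.6]
se = [0.30, 0.25, 0.40]
pooled, pooled_se = inverse_variance_pool(est, se)
print(round(pooled, 3), round(pooled_se, 3))
```

The limitation the protocol targets is visible in the signature: the pool sees only one number per study, so the accuracy estimate cannot be conditioned on patient characteristics; with IPD, the model can be refit on the merged patient-level rows instead.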

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Airborne particles entering the respiratory tract may interact with the apical plasma membrane (APM) of epithelial cells and enter them. Differences in the entry mechanisms of fine (between 0.1 μm and 2.5 μm) and ultrafine (≤ 0.1 μm) particles may be associated with different effects on the APM. We therefore studied particle-induced changes in APM surface area in relation to applied and intracellular particle size, surface area and number. METHODS: Human pulmonary epithelial cells (A549 cell line) were incubated with various concentrations of differently sized fluorescent polystyrene spheres without surface charge (diameter: fine, 1.062 μm; ultrafine, 0.041 μm) by submersed exposure for 24 h. The APM surface area of A549 cells was estimated by design-based stereology and transmission electron microscopy. Intracellular particles were visualized and quantified by confocal laser scanning microscopy. RESULTS: Particle exposure induced an increase in APM surface area compared to the negative control (p < 0.01) at the same surface area concentration of fine and ultrafine particles, a finding not observed at low particle concentrations. Ultrafine particles entered epithelial cells less readily than fine particles; however, at the same particle surface area dose, the number of intracellular ultrafine particles was higher than that of fine particles. The number of intracellular particles increased more strongly for fine than for ultrafine particles at rising particle concentrations. CONCLUSION: This study demonstrates a particle-induced enlargement of the APM surface area of a pulmonary epithelial cell line that depends on particle surface area dose. Particle uptake by epithelial cells does not seem to be responsible for this effect. We propose that direct interactions between the particle surface and the cell membrane cause the enlargement of the APM.
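
The comparison at equal surface area dose has a simple geometric consequence worth making explicit: since the surface area of a sphere scales with the square of its diameter, delivering the same total surface area takes many more ultrafine particles than fine ones. A short sketch using the two diameters stated in the abstract; the total dose value is arbitrary.

```python
import math

def spheres_per_surface_dose(diameter_um, total_area_um2):
    """Number of spheres of a given diameter summing to a total surface area."""
    area_per_sphere = math.pi * diameter_um ** 2  # sphere surface = pi * d^2
    return total_area_um2 / area_per_sphere

dose = 1e6  # arbitrary total surface area dose in um^2
n_fine = spheres_per_surface_dose(1.062, dose)       # fine: 1.062 um
n_ultrafine = spheres_per_surface_dose(0.041, dose)  # ultrafine: 0.041 um

# Ratio is (1.062 / 0.041)^2, independent of the chosen dose.
print(round(n_ultrafine / n_fine))  # 671
```

This is consistent with the result that, at matched surface area dose, intracellular ultrafine particles outnumber intracellular fine particles even though each ultrafine particle enters less readily.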

Relevance:

90.00%

Publisher:

Abstract:

This article outlines some of the issues involved in developing partnerships between service users, practitioners and researchers. It discusses these through experience in Oslo, part of a national-level agreement (HUSK) to improve social services in Norway through research and knowledge development. It begins with a review of the main concepts and debates involved in developing collaborative partnerships for practice-based research, particularly in the social services arena. The HUSK program is then described. The article then traces some specific developments and challenges in negotiating partnership relations, as discussed by program participants (users, practitioners and researchers) in a series of workshops designed to elicit the issues directly from their experience.