44 results for Bayesian approaches


Relevance: 20.00%

Abstract:

Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland, the high school curriculum does not include CS as a subject; instead, the focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize the application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues:

Structured derivations is a logic-based approach to teaching mathematics in which formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation, while at the same time becoming more confident with formalisms.

The Python programming language was originally designed with education in mind and has a simple syntax compared to many other popular languages. The aim of using it in instruction is to address algorithms and their implementation in a way that puts the focus on learning algorithmic thinking and programming instead of on learning a complex syntax.

Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
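As a small illustration of the invariant idea in the Python setting described above (a hypothetical sketch, not material from the thesis; invariant-based programming proper uses diagrammatic notation and full predicate logic), a loop invariant can be made explicit as a runtime assertion:

```python
# Hypothetical sketch: making a loop invariant explicit in Python.
# Invariant-based programming works with diagrams and proofs; here the
# invariant is merely checked at runtime with assert.

def sum_list(xs):
    """Return the sum of xs while checking the loop invariant."""
    total = 0
    i = 0
    while i < len(xs):
        # Invariant: total equals the sum of the first i elements.
        assert total == sum(xs[:i])
        total += xs[i]
        i += 1
    # At exit i == len(xs), so the invariant gives total == sum(xs).
    assert total == sum(xs)
    return total

print(sum_list([3, 1, 4, 1, 5]))  # prints 14
```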

Relevance: 20.00%

Abstract:

The main focus of the present thesis was on verbal episodic memory processes that are particularly vulnerable to preclinical and clinical Alzheimer's disease (AD). These processes were studied with a word learning paradigm, cutting across the domains of memory and language learning studies. Moreover, the differentiation between normal aging, mild cognitive impairment (MCI) and AD was studied with the CERAD cognitive screening test.

In study I, the aim was to examine how patients with amnestic MCI differ from healthy controls in the different CERAD subtests. The sensitivity and specificity of the CERAD screening test for MCI and AD were also examined, as previous studies on the sensitivity and specificity of the CERAD had not included MCI patients. The results indicated that MCI is characterized by an encoding deficit, as shown by the overall worse performance on the CERAD Wordlist learning test compared with controls. As a screening test, CERAD was not very sensitive to MCI.

In study II, verbal learning and forgetting in amnestic MCI, AD and healthy elderly controls were investigated with an experimental word learning paradigm in which the names of 40 unfamiliar objects (mainly archaic tools) were trained with or without semantic support. The object names were trained over a 4-day period, and follow-ups were conducted one week, 4 weeks and 8 weeks after the training period. Manipulation of semantic support was included in the paradigm because it was hypothesized that semantic support might benefit the MCI group in particular, as semantic memory is quite well preserved in MCI in contrast to episodic memory. We found that word learning was significantly impaired in MCI and AD patients, whereas forgetting patterns were similar across groups. Semantic support had a beneficial effect on object name retrieval in the MCI group 8 weeks after training, indicating that the MCI patients' preserved semantic memory abilities compensated for their impaired episodic memory. The MCI group performed as well as the controls in the tasks tapping incidental learning and recognition memory, whereas the AD group showed impairment. Both the MCI and the AD group benefited less from phonological cueing than the controls. Our findings indicate that acquisition is compromised in both MCI and AD, whereas long-term retention is not affected to the same extent. Incidental learning and recognition memory seem to be well preserved in MCI.

In studies III and IV, the neural correlates of naming newly learned objects were examined in healthy elderly subjects and in amnestic MCI patients by means of positron emission tomography (PET) right after the training period. The naming of newly learned objects by healthy elderly subjects recruited a left-lateralized network, including frontotemporal regions and the cerebellum, which was more extensive than the one related to the naming of familiar objects (study III). Semantic support showed no effects on the PET results for the healthy subjects. The observed activation increases may reflect lexical-semantic and lexical-phonological retrieval, as well as more general associative memory mechanisms. In study IV, compared with the controls, the MCI patients showed increased anterior cingulate activation when naming newly learned objects that had been learned without semantic support. This suggests recruitment of additional executive and attentional resources in the MCI group.

Relevance: 20.00%

Abstract:

Today's software industry faces increasingly complicated challenges in a world where software is nearly ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, but at the same time affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one of the reasons why process improvement as a research area has not diminished in importance. IT professionals ask themselves: “How do we keep our promises to our customers, while minimizing our risk and increasing our quality and productivity?”

There are different approaches within the field of process improvement. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight methods, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with a specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. The challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurement methods. Third, knowledge sharing within large companies is improved through methods that put collaboration at the center.

The movement behind agile methods emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that the greater part of the daily work consists of creating something new that has not existed before. Every software developer must be an expert in her field, and spends a large part of her working day creating solutions to problems she has never solved before. Although this has been a well-known fact for many decades, many software projects are still managed as if they were production lines in factories. One of the goals of the agile movement is to highlight precisely this discrepancy between the inner nature of software development and the way in which software projects are managed.

Agile methods have proven to work well in the contexts they were created for, that is, small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interaction with the context in question can be gathered incrementally, which makes it easier to develop and adapt the methods to the specific demands that the context imposes. Second, resistance to change can be overcome more easily by introducing cultural change carefully and by giving the target group direct first-hand contact with the new methods.

Relevant product metrics can help software development teams improve their working methods. For teams working with agile and Lean methods, a good set of metrics can be decisive for decision making when prioritizing the list of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support concerning refactoring, that is, continuous quality improvement of the program's code and design. The decision to refactor can be hard to make, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way to measure the design quality of systems developed using the model-driven paradigm, and we also construct a way to integrate this metric into agile and Lean working methods.

An important part of any process improvement initiative is spreading knowledge about the new software process. This holds regardless of the kind of process being introduced, be it plan-driven or agile. We propose that methods based on collaboration when the process is created and further developed are a good way to support knowledge sharing, and we give an overview of the process authoring tools on the market with that proposal in mind.
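As a hypothetical illustration of what an internal product metric for refactoring decisions could look like (an assumption for illustration only; this is not the design-quality measure for model-driven systems proposed in the thesis), a team might track how densely modules depend on each other:

```python
# Hypothetical sketch of an internal design metric: the ratio of actual
# inter-module dependencies to all possible ones. A rising ratio over
# iterations could support a decision to prioritize refactoring work.

def coupling_ratio(dependencies):
    """dependencies: dict mapping each module to the set of modules it uses."""
    modules = set(dependencies)
    n = len(modules)
    if n < 2:
        return 0.0
    actual = sum(len(deps & modules) for deps in dependencies.values())
    possible = n * (n - 1)  # every module could depend on every other one
    return actual / possible

deps = {"ui": {"core", "db"}, "core": {"db"}, "db": set()}
print(f"coupling ratio: {coupling_ratio(deps):.2f}")  # prints 0.50
```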

Relevance: 20.00%

Abstract:

The currently used forms of cancer therapy are associated with drug resistance and toxicity to healthy tissues. Thus, more efficient methods are needed for cancer-specific induction of growth arrest and programmed cell death, also known as apoptosis. Therapeutic forms of tumor necrosis factor-related apoptosis-inducing ligand (TRAIL) are being investigated in clinical trials because TRAIL can trigger apoptosis specifically in cancer cells by activating cell surface death receptors. Many tumors, however, have acquired resistance to TRAIL-induced apoptosis, and sensitizing drugs for combinatorial treatments are therefore in high demand. This study demonstrates that lignans, natural polyphenols enriched in seeds and cereals, have a remarkable sensitizing effect on TRAIL-induced cell death at non-toxic lignan concentrations. In TRAIL-resistant and androgen-dependent prostate cancer cells we observe that lignans repress receptor tyrosine kinase (RTK) activity and downregulate cell survival signaling via the Akt pathway, which leads to increased TRAIL sensitivity. A structure-activity relationship analysis reveals that the γ-butyrolactone ring of the dibenzylbutyrolactone lignans is essential for the rapidly reversible TRAIL-sensitizing activity of these compounds. Furthermore, the lignan nortrachelogenin (NTG) is identified as the most efficient of the 27 tested lignans and norlignans in sensitizing androgen-deprived prostate cancer cells to TRAIL-induced apoptosis.

While this combinatorial anticancer approach may leave normal cells unharmed, several efficient cancer drugs are too toxic, insoluble or unstable to be used in systemic therapy. To enable the use of such drugs and to protect normal cells from cytotoxic effects, cancer-targeted drug delivery vehicles of nanometer scale have recently been developed. The nanoparticle system that we tested in vitro for cancer cell targeting combines the efficient drug-loading capacity of mesoporous silica with the versatile particle surface functionalization of hyperbranched poly(ethylene imine), PEI. The mesoporous hybrid silica nanoparticles (MSNs) were functionalized with folic acid to promote targeted internalization by folate receptor overexpressing cancer cells. The presented results demonstrate that the developed carrier system can be employed in vitro for cancer-selective delivery of adsorbed or covalently conjugated molecules and, furthermore, for selective induction of apoptotic cell death in folate receptor expressing cancer cells. The tested carrier system displays potential for the simultaneous delivery of several anticancer agents specifically to cancer cells also in vivo.

Relevance: 20.00%

Abstract:

Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool for quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive.

After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation.

A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
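The flavour of MCMC-based parameter estimation can be shown with a minimal random-walk Metropolis sketch; the toy exponential-decay model, noise level, and proposal width below are illustrative assumptions, not details from the thesis:

```python
# Minimal random-walk Metropolis sketch: sample the posterior of a decay
# rate k in the toy model y = exp(-k*t) + noise, under a flat prior on k > 0.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 20)
y_obs = np.exp(-0.7 * t) + rng.normal(0, 0.05, t.size)  # synthetic data

def log_post(k, sigma=0.05):
    if k <= 0:
        return -np.inf                      # flat prior restricted to k > 0
    resid = y_obs - np.exp(-k * t)
    return -0.5 * np.sum((resid / sigma) ** 2)

k, chain = 1.0, []
for _ in range(5000):
    k_prop = k + rng.normal(0, 0.1)         # random-walk proposal
    if np.log(rng.uniform()) < log_post(k_prop) - log_post(k):
        k = k_prop                          # accept the proposed value
    chain.append(k)

samples = np.array(chain[1000:])            # discard burn-in
print(f"k = {samples.mean():.3f} +/- {samples.std():.3f}")
```

Replacing log_post with a call to an expensive simulator is exactly the setting that motivates the further method development for computationally intensive models mentioned above.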

Relevance: 20.00%

Abstract:

Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, are easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. At this point the quality of a system is regarded as an essential issue, since any deficiencies may lead to considerable financial loss or the endangerment of lives.

Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional development. It is therefore important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions.

The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This requires establishing techniques for the evaluation of rigorous developments. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting. Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes.

It is important to control software development, especially in its initial stages. We therefore focus on the specification and modelling phases, as well as on related artefacts, e.g. models, since these have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that they can facilitate the process of creating software systems by, e.g., controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches.

The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurements as an indispensable part of the quality control process and as a strategy towards quality improvement.
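As a hypothetical illustration of metrics based on structural characteristics of an early-stage model (the model representation and the counted characteristics below are assumptions for the sketch, not the measurement plans of the thesis):

```python
# Hypothetical sketch: simple structural counts over an abstract model
# with variables, invariants, and guarded events.

def structural_metrics(model):
    """model: dict with 'variables', 'invariants', and 'events',
    where each event is a dict with 'guards' and 'actions'."""
    events = model["events"]
    return {
        "variables": len(model["variables"]),
        "invariants": len(model["invariants"]),
        "events": len(events),
        "guards_per_event": sum(len(e["guards"]) for e in events) / max(len(events), 1),
    }

m = {"variables": ["x", "y"],
     "invariants": ["x >= 0", "y > x"],
     "events": [{"guards": ["x < 10"], "actions": ["x := x + 1"]},
                {"guards": ["x > 0", "y > 1"], "actions": ["y := y - 1"]}]}
print(structural_metrics(m))
```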

Relevance: 20.00%

Abstract:

Supply chain risk management has emerged as an increasingly important issue in logistics, as disruptions in the supply chain have become critical issues for many companies. The scientific literature on the subject is still developing, and in many respects the understanding of it is in its infancy. There is thus a need for more information in order for scholars and practitioners to understand the causalities and interrelations that characterise the phenomenon. The aim of this dissertation is to narrow this gap by exploring key aspects of supply chain risk management through two maritime supply chains in the immediate region of the Gulf of Finland.

The study contributes to the field in three ways. Firstly, it facilitates the identification of risks on different levels of the supply chain through a systematic analysis of the processes and actors, and of the cognitive barriers that limit the actors' visibility and their understanding of the operations and the risks involved. There is a clear need to increase collaboration and information exchange in order to improve visibility in the chain; risk management should be a collaborative effort among the individual actors, aimed at obtaining a holistic picture. Secondly, the study contributes to the literature on risk analysis through the use of systemic frameworks that illustrate the causalities and linkages in the system, thereby making it easier to perceive the vulnerabilities. Thirdly, the study enhances current knowledge of risk control by identifying actor roles, risk visibility and risk controllability as being among the key factors determining risk-management effectiveness against supply-chain vulnerability.

This dissertation is divided into two parts. The first part gives a general overview of the relevant literature, the research design and the conclusions of the study, and the second part comprises six research publications. A case-study methodology with a systematic combining approach is used, with in-depth interviews, questionnaires and expert panel sessions as the main data collection methods. The study illustrates the current state of risk management in multimodal maritime supply chains and develops frameworks for further analysis. The results imply that there are major differences between organizations in their ability to execute supply chain risk management. Further collaboration should be considered in order to facilitate the development of systematic and effective management processes.

Relevance: 20.00%

Abstract:

Customer satisfaction has been a widely studied concept due to its importance for business performance. Customer satisfaction should ideally lead to customer loyalty and have a positive effect on business profitability and growth. This study investigates customer satisfaction and loyalty in Do-It-Yourself retailing in the Russian market. The “K-rauta” retail chain was chosen as the focus company for this study. The goal of the study was to investigate what creates customer satisfaction in this market and what role quality, trust and satisfaction play in creating customer loyalty. The role of the Internet in the consumer purchasing process was also investigated, and consumer preferences towards new marketing solutions, such as smartphone applications, were briefly examined.

Relevance: 20.00%

Abstract:

T helper (Th) cells are vital regulators of the adaptive immune system. When activated by the presentation of cognate antigen, Th cells demonstrate the capacity to differentiate into functionally distinct effector cell subsets. The Th2 subset is required for protection against extracellular parasites, such as helminths, but is also closely linked to the pathogenesis of asthma and allergies. The intracellular molecular signal transduction pathways regulating T helper cell subset differentiation are still incompletely known. Moreover, the great majority of studies on Th2 differentiation have been conducted with mouse models, while studies with human cells have been comparatively few. The goal of this thesis was to characterize molecular mechanisms promoting the development of the Th2 phenotype, focusing specifically on human umbilical cord blood T cells as an experimental model. These primary cells, activated and differentiated into Th2 cells in vitro, were investigated by complementary system-wide approaches targeting the levels of mRNA, proteins, and lipid molecules. Specifically, the results indicated IL4-regulated recruitment of nuclear protein and described novel components of the Th2-promoting STAT6 enhanceosome complex. Furthermore, the development of the activated effector cell phenotype was found to correlate with remodeling of the cellular lipidome. These findings will hopefully advance the understanding of human Th2 cell lineage commitment and the development of Th2-associated disease states.

Relevance: 20.00%

Abstract:

The purpose of this thesis is twofold. The first and major part is devoted to sensitivity analysis of various discrete optimization problems, while the second part addresses methods for calculating measures of solution stability and for solving multicriteria discrete optimization problems.

Despite the numerous approaches to stability analysis of discrete optimization problems, two major directions can be singled out: quantitative and qualitative. Qualitative sensitivity analysis is conducted for multicriteria discrete optimization problems with minisum, minimax and minimin partial criteria. The main results obtained here are necessary and sufficient conditions for different stability types of optimal solutions (or of a set of optimal solutions) of the considered problems. Within the quantitative direction, various measures of solution stability are investigated. A formula for a quantitative characteristic called the stability radius is obtained for the generalized equilibrium situation invariant to changes of game parameters in the case of the Hölder metric. The quality of a problem solution can also be described in terms of robustness analysis. In this work, the concepts of accuracy and robustness tolerances are presented for a strategic game with a finite number of players, where the initial coefficients (costs) of the linear payoff functions are subject to perturbations.

Investigation of the stability radius also aims to devise methods for its calculation. A new metaheuristic approach is derived for calculating the stability radius of an optimal solution to the shortest path problem. The main advantage of the developed method is that it is potentially applicable to calculating the stability radii of NP-hard problems.

The last chapter of the thesis focuses on deriving innovative methods, based on an interactive optimization approach, for solving multicriteria combinatorial optimization problems. The key idea of the proposed approach is to utilize a parameterized achievement scalarizing function for solution calculation and to direct the interactive procedure by changing the weighting coefficients of this function. To illustrate the introduced ideas, a decision-making process is simulated for a three-objective median location problem. The concepts, models, and ideas collected and analyzed in this thesis create good and relevant grounds for developing more complicated and integrated models of postoptimal analysis and for solving the most computationally challenging problems related to it.
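For reference, a common form of an achievement scalarizing function is the augmented Wierzbicki formulation below; the abstract does not state the exact parameterization used in the thesis, so take this as the standard textbook version:

```latex
s\bigl(f(x); \bar{z}, w\bigr) \;=\; \max_{i = 1,\dots,k} \, w_i \bigl(f_i(x) - \bar{z}_i\bigr)
\;+\; \rho \sum_{i=1}^{k} w_i \bigl(f_i(x) - \bar{z}_i\bigr)
```

Here \bar{z} is a reference point, the weights w_i > 0 are the coefficients that the interactive procedure adjusts, and \rho > 0 is a small augmentation term; minimizing s over the feasible set yields a Pareto optimal solution.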

Relevance: 20.00%

Abstract:

Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high performance designs. Network-on-Chip (NoC) has emerged as a viable solution to the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes make it possible to design a high performance system in a limited chip area. The major advantages of 3D NoCs are considerable reductions in average latency and power consumption.

Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms.

Congestion of data packets may lead to increased network latency and power consumption. We therefore propose three different approaches for alleviating congestion in the network. The first approach is based on measuring the congestion information in different regions of the network, distributing the information over the network, and utilizing this information when making a routing decision. The second approach employs a learning method to dynamically find the less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique that makes better routing decisions when traffic information for the different routes is available.

Faults affect performance significantly, as packets must then take longer paths in order to be routed around the faults, which in turn increases congestion around the faulty regions. We propose four methods for tolerating faults at the link and switch level by using only the shortest paths, as long as such paths exist. The unique characteristic of these methods is that they tolerate faults while also maintaining the performance of the NoC. To the best of our knowledge, these algorithms are the first approaches that bypass faults before reaching them while avoiding unnecessary misrouting of packets.

Current implementations of multicast communication result in a significant performance loss for unicast traffic, because the routing rules for multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be routed efficiently within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
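As a toy illustration of the first, congestion-information-based idea (a hypothetical Python sketch, not the algorithm developed in the thesis), a 2D mesh router can pick, among the output ports that lie on a minimal path to the destination, the one whose downstream buffer reports the least congestion:

```python
# Hypothetical sketch: congestion-aware selection among minimal-path
# output ports in a 2D mesh NoC router.

def route(current, dest, congestion):
    """current, dest: (x, y) router coordinates.
    congestion: dict mapping output port -> downstream buffer occupancy."""
    x, y = current
    dx, dy = dest[0] - x, dest[1] - y
    candidates = []
    if dx != 0:
        candidates.append("east" if dx > 0 else "west")
    if dy != 0:
        candidates.append("north" if dy > 0 else "south")
    if not candidates:
        return "local"                      # packet has reached its destination
    return min(candidates, key=lambda port: congestion[port])

cong = {"east": 3, "west": 0, "north": 1, "south": 5}
print(route((0, 0), (2, 2), cong))  # "north": less congested than "east"
```

In a real NoC the congestion values would come from a mechanism that measures and distributes congestion information across the network, as described above; here they are simply given as a dictionary.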

Relevance: 20.00%

Abstract:

Prediction means estimating the future value of an observable quantity. Characteristic of the Bayesian paradigm is that uncertainty about unknown quantities is expressed in the form of probabilities. A Bayesian predictive model is thus a probability distribution over the possible values that an observable, but not yet observed, quantity can take. The articles included in the thesis develop methods that are applied, among other things, to the analysis of chromatographic data in criminal investigations. With the exception of the first article, all of the methods build on Bayesian predictive modelling. The articles mainly consider three types of problems related to chromatographic data: quantification, pairwise matching and clustering. The first article develops a non-parametric model for the measurement error of chromatographic analyses of blood alcohol content. The second article develops a predictive inference method for the comparison of two samples. This method is applied in the third article to the comparison of oil samples, with the aim of identifying the polluting source in connection with oil spills. In the fourth article, a predictive model is derived for clustering data of mixed discrete and continuous type, and it is applied, among other things, to the classification of amphetamine samples with respect to production batches.
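To make the notion of a Bayesian predictive model concrete, the sketch below computes the closed-form posterior predictive distribution for a toy normal model with known measurement noise and a conjugate normal prior; the numbers and names are illustrative and not taken from the articles:

```python
# Hypothetical sketch: posterior predictive for y ~ Normal(mu, sigma^2)
# with known sigma and a conjugate Normal(m0, s0^2) prior on mu.

import numpy as np

def posterior_predictive(y, sigma, m0, s0):
    """Return (mean, sd) of the predictive distribution for a new observation."""
    n = len(y)
    prec = 1 / s0**2 + n / sigma**2              # posterior precision of mu
    m_post = (m0 / s0**2 + np.sum(y) / sigma**2) / prec
    var_post = 1 / prec
    return m_post, np.sqrt(var_post + sigma**2)  # predictive adds data noise

y = np.array([0.48, 0.52, 0.50, 0.47])           # e.g., repeated measurements
print(posterior_predictive(y, sigma=0.02, m0=0.5, s0=0.1))
```

The returned distribution is exactly a probability distribution over the possible values of a not-yet-observed quantity, which is what the predictive models in the articles generalize to harder settings such as matching and clustering.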