900 results for new keynesian models


Relevance:

30.00%

Publisher:

Abstract:

A new concept in the therapy of both neoplastic and non-neoplastic diseases is discussed in this article. Photodynamic therapy (PDT) involves light activation, in the presence of molecular oxygen, of certain dyes that are taken up by the target tissue. These dyes are termed photosensitizers. The mechanism of interaction of the photosensitizers and light is discussed, along with the effects produced in the target tissue. The present status of clinical PDT is reviewed, along with the newer photosensitizers being used and their clinical roles. Despite the promising results from earlier clinical trials of PDT, considerable additional work is needed to bring this new modality of treatment into modern clinical practice. Improvements in light source delivery, light dosimetry and the computation of treatment models are necessary to standardize treatments and ensure proper treatment delivery. Finally, quality assurance procedures should be introduced into the treatment process.

Relevance:

30.00%

Publisher:

Abstract:

Linguistic modelling is a rather new branch of mathematics that is still undergoing rapid development. It is closely related to fuzzy set theory and fuzzy logic, but knowledge and experience from other fields of mathematics, as well as other fields of science including linguistics and behavioral sciences, is also necessary to build appropriate mathematical models. This topic has received considerable attention as it provides tools for mathematical representation of the most common means of human communication - natural language. Adding a natural language level to mathematical models can provide an interface between the mathematical representation of the modelled system and the user of the model - one that is sufficiently easy to use and understand, but yet conveys all the information necessary to avoid misinterpretations. It is, however, not a trivial task and the link between the linguistic and computational level of such models has to be established and maintained properly during the whole modelling process. In this thesis, we focus on the relationship between the linguistic and the mathematical level of decision support models. We discuss several important issues concerning the mathematical representation of meaning of linguistic expressions, their transformation into the language of mathematics and the retranslation of mathematical outputs back into natural language. In the first part of the thesis, our view of linguistic modelling for decision support is presented and the main guidelines for building linguistic models for real-life decision support that are the basis of our modelling methodology are outlined. From the theoretical point of view, the issues of representation of meaning of linguistic terms, computations with these representations and the retranslation process back into the linguistic level (linguistic approximation) are studied in this part of the thesis. We focus on the reasonability of operations with the meanings of linguistic terms, the correspondence of the linguistic and mathematical level of the models and on proper presentation of appropriate outputs. We also discuss several issues concerning the ethical aspects of decision support - particularly the loss of meaning due to the transformation of mathematical outputs into natural language and the issue of responsibility for the final decisions. In the second part several case studies of real-life problems are presented. These provide background and necessary context and motivation for the mathematical results and models presented in this part. A linguistic decision support model for disaster management is presented here, formulated as a fuzzy linear programming problem, and a heuristic solution to it is proposed. Uncertainty of outputs, expert knowledge concerning disaster response practice and the necessity of obtaining outputs that are easy to interpret (and available in very short time) are reflected in the design of the model. Saaty's analytic hierarchy process (AHP) is considered in two case studies - first in the context of the evaluation of works of art, where a weak consistency condition is introduced and an adaptation of AHP for large matrices of preference intensities is presented. The second AHP case study deals with the fuzzified version of AHP and its use for evaluation purposes - particularly the integration of peer review into the evaluation of R&D outputs is considered.
In the context of HR management, we present a fuzzy rule-based evaluation model (academic faculty evaluation is considered) constructed to provide outputs that do not require linguistic approximation and are easily transformed into graphical information. This is achieved by designing a specific form of fuzzy inference. Finally, the last case study is from the area of humanities - psychological diagnostics is considered and a linguistic fuzzy model for the interpretation of outputs of multidimensional questionnaires is suggested. The issue of the quality of data in mathematical classification models is also studied here. A modification of the receiver operating characteristics (ROC) method is presented to reflect variable quality of data instances in the validation set during classifier performance assessment. Twelve publications in which the author participated are appended as a third part of this thesis. These summarize the mathematical results and provide a closer insight into the issues of the practical applications that are considered in the second part of the thesis.
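The abstract above turns on representing the meaning of linguistic terms mathematically and then retranslating numerical outputs back into natural language (linguistic approximation). As a rough illustration of that idea only, the Python sketch below uses invented labels, triangular membership functions and a naive maximum-membership selection rule; none of it is taken from the thesis models.

```python
# Illustrative sketch: linguistic terms as triangular fuzzy numbers on [0, 1]
# and a naive linguistic approximation of a crisp model output.
def triangular(a, b, c):
    """Membership function of a triangular fuzzy number (a, b, c)."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

terms = {
    "low":    triangular(-0.01, 0.0, 0.5),   # small overhangs so the endpoints are covered
    "medium": triangular(0.0, 0.5, 1.0),
    "high":   triangular(0.5, 1.0, 1.01),
}

def linguistic_approximation(x):
    """Return the linguistic term whose meaning fits the output x best."""
    return max(terms, key=lambda label: terms[label](x))

print(linguistic_approximation(0.12))  # -> 'low'
print(linguistic_approximation(0.65))  # -> 'medium'
```

A real decision support model would of course rely on carefully elicited membership functions and a more refined approximation criterion than a simple maximum-membership rule.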

Relevance:

30.00%

Publisher:

Abstract:

This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas such as the global positioning systems, target tracking, navigation, brain imaging, spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to computation of the posterior probability density function. Except for a very restricted number of models, it is impossible to compute this density function in closed form. Hence, we need approximation methods. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on available measurements. Among these filters, particle filters are numerical methods for approximating the filtering distributions of non-linear non-Gaussian state space models via Monte Carlo. The performance of a particle filter heavily depends on the chosen importance distribution. For instance, an inappropriate choice of the importance distribution can lead to the failure of convergence of the particle filter algorithm. In this thesis, we analyze the theoretical Lᵖ particle filter convergence with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, estimation of parameters can be done by Markov chain Monte Carlo (MCMC) methods. In its operation, the MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution. In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, where the states are integrated out. This type of computation is then applied to estimate parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use the hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends heavily on the chosen proposal distribution. A commonly used proposal distribution is Gaussian. With this kind of proposal, the covariance matrix must be well tuned. To tune it, adaptive MCMC methods can be used. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
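Since the abstract above refers to particle filtering and the role of the importance distribution, a minimal bootstrap particle filter may help fix ideas. The sketch below is an illustration under stated assumptions (NumPy, a 1-D random-walk toy model, the state transition density used as the importance distribution); it is not the thesis code and does not touch the Lᵖ convergence analysis or the adaptive MCMC material.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(y, n_particles, transition, likelihood, init):
    """Approximate the filtering means of a state space model by Monte Carlo.

    y          : sequence of observations
    transition : function(particles) -> particles sampled from p(x_t | x_{t-1})
    likelihood : function(y_t, particles) -> observation densities p(y_t | x_t)
    init       : function(n) -> initial particles
    """
    particles = init(n_particles)
    means = []
    for y_t in y:
        particles = transition(particles)      # propagate through the dynamics
        w = likelihood(y_t, particles)         # importance weights
        w = w / w.sum()                        # normalise
        idx = rng.choice(n_particles, n_particles, p=w)
        particles = particles[idx]             # multinomial resampling
        means.append(particles.mean())         # filtering mean estimate
    return np.array(means)

# Toy usage: a 1-D Gaussian random walk observed in Gaussian noise.
x = np.cumsum(rng.normal(size=50))
y = x + rng.normal(scale=0.5, size=50)
est = bootstrap_particle_filter(
    y, 1000,
    transition=lambda p: p + rng.normal(size=p.shape),
    likelihood=lambda y_t, p: np.exp(-0.5 * ((y_t - p) / 0.5) ** 2),
    init=lambda n: rng.normal(size=n),
)
```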

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this Master's thesis was to study business model development in the Finnish newspaper industry during the next ten years through scenario planning. The objective was to see how the business models will develop amid the many changes in the industry, what factors are affecting the change, what the implications of these changes are for the players in the industry, and how Finnish newspaper companies should evolve in order to succeed in the future. In this thesis the business model change is studied based on all the elements of business models, as it was discovered that the industry is too often focusing on changes in only a few of those elements, and a broader view can provide valuable information for the companies. The results revealed that the industry will be affected by many changes during the next ten years. Scenario planning provides a good tool for analyzing this change and for developing valuable options for businesses. After conducting a series of interviews and identifying the forces affecting the change, four different scenarios were developed, centered on the role that newspapers will take and the level at which they will provide content in the future. These scenarios indicated that there is a variety of options in the way the business models may develop and that companies should start making decisions proactively in order to succeed. As the business model elements are interdependent, changes made in any of the elements will affect the whole model, making these decisions about the role and level of content important for the companies. In the future, it is likely that the Finnish newspaper industry will include many different kinds of business models, some of which may be drastically different from the current ones and some of which may still be similar, but take the new kind of media environment better into account.

Relevance:

30.00%

Publisher:

Abstract:

Alzheimer's disease (AD) is the most common form of dementia. Characteristic changes in an AD brain are the formation of β-amyloid protein (Aβ) plaques and neurofibrillary tangles, though other alterations in the brain have also been connected to AD. No cure is available for AD and it is one of the leading causes of death among the elderly in developed countries. Liposomes are biocompatible and biodegradable spherical phospholipid bilayer vesicles that can enclose various compounds. Several functional groups can be attached to the surface of liposomes in order to achieve long-circulating target-specific liposomes. Liposomes can be utilized as drug carriers and vehicles for imaging agents. Positron emission tomography (PET) is a non-invasive imaging method to study biological processes in living organisms. In this study, various nucleophilic 18F-labeling synthesis approaches and leaving groups were developed for novel PET imaging tracers that target AD pathology in the brain. The tracers were the thioflavin derivative [18F]flutemetamol, the curcumin derivative [18F]treg-curcumin, and functionalized [18F]nanoliposomes, which all target Aβ in the AD brain. These tracers were evaluated using transgenic AD mouse models. In addition, 18F-labeling synthesis was developed for a tracer targeting the S1P3 receptor. The chosen 18F-fluorination strategy had an effect on the radiochemical yield and specific activity of the tracers. [18F]Treg-curcumin and functionalized [18F]nanoliposomes had low uptake in AD mouse brain, whereas [18F]flutemetamol exhibited the appropriate properties for preclinical Aβ-imaging. All of these tracers can be utilized in studies of the pathology and treatment of AD and related diseases.

Relevance:

30.00%

Publisher:

Abstract:

Dopamine constitutes about 80% of the content of central catecholamines and has a crucial role in the etiology of several neuropsychiatric disorders, including Parkinson's disease, depression and schizophrenia. Several dopaminergic drugs are used to treat these pathologies, but many problems are attributed to these therapies. Within this context, the search for new more efficient dopaminergic agents with less adverse effects represents a vast research field. The aim of the present study was to report the structural design of two N-phenylpiperazine derivatives, compound 4: 1-[1-(4-chlorophenyl)-1H-4-pyrazolylmethyl]-4-phenylhexahydropyrazine and compound 5: 1-[1-(4-chlorophenyl)-1H-1,2,3-triazol-4-ylmethyl]-4-phenylhexahydropyrazine, planned to be dopamine ligands, and their dopaminergic action profile. The two compounds were assayed (dose range of 15-40 mg/kg) in three experimental models: 1) blockade of amphetamine (30 mg/kg, ip)-induced stereotypy in rats; 2) the catalepsy test in mice, and 3) apomorphine (1 mg/kg, ip)-induced hypothermia in mice. Both derivatives induced cataleptic behavior (40 mg/kg, ip) and a hypothermic response (30 mg/kg, ip) which was not prevented by haloperidol (0.5 mg/kg, ip). Compound 5 (30 mg/kg, ip) also presented a synergistic hypothermic effect with apomorphine (1 mg/kg, ip). Only compound 4 (30 mg/kg, ip) significantly blocked the amphetamine-induced stereotypy in rats. The N-phenylpiperazine derivatives 4 and 5 seem to have a peculiar profile of action on dopaminergic functions. On the basis of the results of catalepsy and amphetamine-induced stereotypy, the compounds demonstrated an inhibitory effect on dopaminergic behaviors. However, their hypothermic effect is compatible with the stimulation of dopaminergic function which seems not to be mediated by D2/D3 receptors.

Relevance:

30.00%

Publisher:

Abstract:

A new area of machine learning research called deep learning has moved machine learning closer to one of its original goals: artificial intelligence and a general learning algorithm. The key idea is to pretrain models in a completely unsupervised way and then fine-tune them for the task at hand using supervised learning. In this thesis, a general introduction to deep learning models and algorithms is given, and these methods are applied to facial keypoints detection. The task is to predict the positions of 15 keypoints on grayscale face images. Each predicted keypoint is specified by an (x,y) real-valued pair in the space of pixel indices. In the experiments, we pretrained deep belief networks (DBN) and finally performed discriminative fine-tuning. We varied the depth and size of the architecture. We tested both deterministic and sampled hidden activations and the effect of additional unlabeled data on pretraining. The experimental results show that our model provides better results than publicly available benchmarks for the dataset.
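As a loose illustration of the final, supervised stage described above (discriminative fine-tuning of a network that outputs 15 (x, y) keypoints), the Python sketch below trains a plain feed-forward regressor on random stand-in data. It is an assumption-laden toy: the 96x96 image size, the layer sizes and the use of scikit-learn's MLPRegressor are illustrative choices, and the unsupervised DBN pretraining stage is omitted entirely.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_images, side, n_keypoints = 200, 96, 15      # 96x96 is an assumed image size
X = rng.random((n_images, side * side))        # flattened grayscale pixels in [0, 1]
y = rng.random((n_images, n_keypoints * 2))    # (x, y) targets, random stand-ins here

# Supervised fine-tuning stage only: map pixels to 30 real-valued outputs.
model = MLPRegressor(hidden_layer_sizes=(100, 100),  # depth/size were varied in the thesis
                     max_iter=50, random_state=0)
model.fit(X, y)
pred = model.predict(X[:1])                    # 30 values = 15 predicted keypoints
```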

Relevance:

30.00%

Publisher:

Abstract:

Coronary artery disease is an atherosclerotic disease, which leads to narrowing of coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis. Necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time that result in “adverse remodelling”. This remodelling may result in a progressive worsening of cardiac function and development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study the coronary artery disease model combined induced diabetes and hypercholesterolemia. In the second study myocardial ischaemia and infarction were caused by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods for measurement of myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, a hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET-imaging technology. Coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. Large animal models were used in testing of novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to the worsening of systolic function, cardiac remodelling and decreased efficiency of cardiac pumping function. Levosimendan therapy reduced myocardial infarct size and improved cardiac function after infarction. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies concerning disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.

Relevance:

30.00%

Publisher:

Abstract:

Nitric oxide (NO) donors produce NO-related activity when applied to biological systems. Among its diverse functions, NO has been implicated in vascular smooth muscle relaxation. Despite the great importance of NO in biological systems, its pharmacological and physiological studies have been limited due to its high reactivity and short half-life. In this review we will focus on our recent investigations of nitrosyl ruthenium complexes as NO-delivery agents and their effects on vascular smooth muscle cell relaxation. The high affinity of ruthenium for NO is a marked feature of its chemistry. The main signaling pathway responsible for the vascular relaxation induced by NO involves the activation of soluble guanylyl-cyclase, with subsequent accumulation of cGMP and activation of cGMP-dependent protein kinase. This in turn can activate several proteins such as K+ channels as well as induce vasodilatation by a decrease in cytosolic Ca2+. Oxidative stress and associated oxidative damage are mediators of vascular damage in several cardiovascular diseases, including hypertension. The increased production of the superoxide anion (O2-) by the vascular wall has been observed in different animal models of hypertension. Vascular relaxation to the endogenous NO-related response or to NO released from NO deliverers is impaired in vessels from renal hypertensive (2K-1C) rats. A growing amount of evidence supports the possibility that increased NO inactivation by excess O2- may account for the decreased NO bioavailability and vascular dysfunction in hypertension.

Relevance:

30.00%

Publisher:

Abstract:

The mortality rate of older patients with intertrochanteric fractures has been increasing with the aging of populations in China. The purpose of this study was: 1) to develop an artificial neural network (ANN) using clinical information to predict the 1-year mortality of elderly patients with intertrochanteric fractures, and 2) to compare the ANN's predictive ability with that of logistic regression models. The ANN model was tested against actual outcomes of an intertrochanteric femoral fracture database in China. The ANN model was generated with eight clinical inputs and a single output. The ANN's performance was compared with a logistic regression model created with the same inputs in terms of accuracy, sensitivity, specificity, and discriminability. The study population was composed of 2150 patients (679 males and 1471 females): 1432 in the training group and 718 new patients in the testing group. The ANN model that had eight neurons in the hidden layer had the highest accuracies among the four ANN models: 92.46% and 85.79% in the training and testing datasets, respectively. The areas under the receiver operating characteristic curves of the automatically selected ANN model for the two datasets were 0.901 (95%CI=0.814-0.988) and 0.869 (95%CI=0.748-0.990), higher than the 0.745 (95%CI=0.612-0.879) and 0.728 (95%CI=0.595-0.862) of the logistic regression model. The ANN model can be used for predicting 1-year mortality in elderly patients with intertrochanteric fractures. It outperformed the logistic regression model on multiple performance measures when given the same variables.
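To make the comparison in the abstract concrete, the sketch below contrasts a small feed-forward network (eight hidden neurons, matching the best ANN described) with logistic regression on synthetic stand-in data and reports test-set AUC. The generated data, split sizes and scikit-learn estimators are illustrative assumptions only; the study's actual fracture database and modelling pipeline are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Stand-in for the fracture database: 2150 patients, 8 clinical inputs,
# binary 1-year mortality outcome (synthetic, for illustration only).
X, y = make_classification(n_samples=2150, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=718 / 2150, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000,
                    random_state=0).fit(X_train, y_train)
logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)

for name, clf in [("ANN", ann), ("logistic regression", logit)]:
    auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
    print(f"{name}: test AUC = {auc:.3f}")
```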

Relevance:

30.00%

Publisher:

Abstract:

The costs of health care are going up in many countries. In order to provide affordable and effective health care solutions, new technologies and approaches are constantly being developed. In this research, video games are presented as a possible solution to the problem. Video games are fun, and nowadays most people like to spend time on them. In addition, recent studies have pointed out that video games can have notable health benefits. Health games have already been developed, used in practice, and researched. However, the bulk of health game studies have been concerned with the design or the effectiveness of the games; no actual business studies have been conducted on the subject, even though health games often lack commercial success despite their health benefits. This thesis seeks to fill this gap. The specific aim of this thesis is to develop a conceptual business model framework and empirically use it in explorative medical game business model research. In the first stage of this research, a literature review was conducted and the existing literature analyzed and synthesized into a conceptual business model framework consisting of six dimensions. The motivation behind the synthesis is the ongoing ambiguity around the business model concept. In the second stage, 22 semi-structured interviews were conducted with different professionals within the value network for medical games. The business model framework was present in all stages of the empirical research: First, in the data collection stage, the framework acted as a guiding instrument, focusing the interview process. Then, the interviews were coded and analyzed using the framework as a structure. The results were then reported following the structure of the framework. In the results, the interviewees highlighted several important considerations and issues for medical games concerning the six dimensions of the business model framework. Based on the key findings of this research, several key components of business models for medical games were identified and illustrated in a single figure. Furthermore, five notable challenges for business models for medical games were presented, and possible solutions for the challenges were postulated. Theoretically, these findings provide pioneering information on the untouched subject of business models for medical games. Moreover, the conceptual business model framework and its use in the novel context of medical games provide a contribution to the business model literature. Regarding practice, this thesis further accentuates that medical games can offer notable benefits to several stakeholder groups and offers advice to companies seeking to commercialize these games.

Relevance:

30.00%

Publisher:

Abstract:

Restructuring by adding sodium alginate or microbial transglutaminase (MTGase) using cold gelation technology makes it possible to obtain many different raw products from minced and/or chopped fish muscle that are suitable for use as the basis of new restructured products with different physicochemical properties and even different compositions. Special consideration must be given to their shelf-life and the changes that may take place during chilling, both in visual appearance and physicochemical properties. During chilled storage at low temperature (5 °C) of the restructured models made with different muscle particle sizes and compositions, it was observed that microbial growth limited the shelf-life to 7-14 days. Mechanical properties increased (p < 0.05) during that time, and higher values were observed in samples prepared by joining small muscle particles than in those prepared by homogenization. There was no clear increase in cooking yield or purge loss, and no significant colour change (p > 0.05) was detected during storage.

Relevance:

30.00%

Publisher:

Abstract:

The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides some guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second one studies reaction systems, a modelling framework with its rationale built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic in the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathways recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs. We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modelling framework studied in this thesis. The reaction systems' rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modelling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of a dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state, periodicity, etc., in order to do model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties varies from P to NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS and introduce the conservation dependency graph to capture the relation between the species, and also propose an algorithm to list the conserved sets of a given reaction system.
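For readers unfamiliar with the reaction systems formalism mentioned above, the following Python sketch implements its standard one-step semantics: a reaction fires when all of its reactants and none of its inhibitors are present, and only produced entities survive to the next state. The example reactions are invented for illustration; they are not the heat shock response model from the thesis.

```python
# Minimal sketch of one step of a reaction system (standard semantics).
def result(reactions, state):
    """reactions: iterable of (reactants, inhibitors, products) sets;
    state: set of entities currently present.
    A reaction is enabled when all its reactants are present (facilitation)
    and none of its inhibitors are present (inhibition). The next state is
    the union of products of enabled reactions; nothing persists by itself."""
    nxt = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            nxt |= products
    return nxt

# Toy example: two interacting reactions.
reactions = [
    (frozenset({"a"}), frozenset({"c"}), frozenset({"b"})),       # a -> b unless c present
    (frozenset({"b"}), frozenset(), frozenset({"a", "c"})),       # b -> a, c
]
state = {"a"}
for _ in range(4):
    state = result(reactions, state)
    print(sorted(state))   # ['b'], ['a', 'c'], [], []
```

The empty states at the end of the trace show the non-permanence principle: entities that are not produced by some enabled reaction simply vanish.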

Relevance:

30.00%

Publisher:

Abstract:

This study concentrates on developing a suitable business model for Finnish biobanks, with particular emphasis on value creation for stakeholders. The sub-objectives of this thesis are to map the commercial possibilities of biobanks and the potential barriers to business development. The study approaches the subject from the biobanks' as well as the stakeholders' point of view, integrating their hopes and needs concerning current and future co-operation into the findings. In 2013 the Biobank Act came into effect, after which six biobanks have been established and several other biobank projects are pending. There is relatively little research on the commercial opportunities of this newcomer to the biomedical industry, particularly in the Finnish market. Therefore, the aim of this study is to partially fill the research gap concerning the commercial potential of biobanks and particularly to outline the problematic elements in developing business. The theoretical framework consists of a few select theories, which depict business modeling and value creation in organizations. The theories are combined to form a synthesis, which best adapts to biobanks and acts as a backbone for the interviews. The empirical part of the study was conducted mainly through seven face-to-face interviews, complemented by two phone interviews and an e-mail questionnaire with four responses. The findings consist mainly of the participants' reflections on the potential products and services enabled by consumer genomics, as well as perceptions of different obstacles to biobanks' business development. The nature of the study is tentative, as biobanks are relatively new organizations in Finland, and their operating models and activities are still taking shape. The aim is to bring to the surface the hopes and concerns of biobanks' representatives, as well as the representatives of stakeholders, in order to transparently discuss the current situation and suggestions for further development. The study concludes that, in principle, the interviewees agree on the need for development in order not to waste the potential of biobanks; nevertheless, the participants emphasize different aspects and consequently lean on differing methods.

Relevance:

30.00%

Publisher:

Abstract:

Business plans are made when establishing a new company or when organizations launch new products or services. This Master's thesis examined which elements are included in a business plan and which are emphasized. Since a business plan is a wide-ranging document and can also contain company-specific information, the literature review was restricted to three areas, which were investigated from the related literature and articles. The selected areas were Market Segmentation and Targeting, Competitive Environment, and Market Positioning and Strategy. Different business plan models were investigated by interviewing companies that operate in industry sectors different from each other. The models were compared to each other and to the findings from the literature. Based on the interview results and literature findings, a business plan for fibre-based packaging was created. The created business plan contains the three selected areas. It was found that the selected business plan elements can be found in the interviewed companies' business plans. The market segmentation was done by comparing the market share to the known total market size. When analyzing the competitive environment, no single model was in use. The tools used to evaluate the competitive environment were selected parts of both the SWOT analysis and Porter's five forces model, where applicable. Based on the interview results, it can be stated that a company or organization should find and build its own model for business plans. In order to receive the benefits for future planning, the company should use the same model for a long time.