845 results for declarative, procedural, and reflective (DPR) model
Abstract:
Iron homeostasis dysregulation has been regarded as an important mechanism in neurodegenerative diseases. The H63D and C282Y polymorphisms in the HFE gene may be involved in the development of sporadic amyotrophic lateral sclerosis (ALS) through the disruption of iron homeostasis. However, studies investigating the relationship between ALS and these two polymorphisms have yielded contradictory outcomes. We performed a meta-analysis to assess the roles of the H63D and C282Y polymorphisms of HFE in ALS susceptibility. PubMed, MEDLINE, EMBASE, and Cochrane Library databases were systematically searched to identify relevant studies. Strict selection and exclusion criteria were applied. Odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the strength of associations. A fixed- or random-effect model was selected depending on the results of the heterogeneity test. Fourteen studies were included in the meta-analysis (six studies with 1692 cases and 8359 controls for C282Y; 14 studies with 5849 cases and 13,710 controls for H63D). For the C282Y polymorphism, significant associations were observed in the allele model (Y vs C: OR=0.76, 95%CI=0.62-0.92, P=0.005) and the dominant model (YY+CY vs CC: OR=0.75, 95%CI=0.61-0.92, P=0.006). No associations were found under any genetic model for the H63D polymorphism. The C282Y polymorphism in HFE could be a potential protective factor for ALS in Caucasians, whereas the H63D polymorphism does not appear to be associated with ALS.
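Pooled ORs like those reported above are conventionally obtained by inverse-variance weighting of per-study log odds ratios. A minimal sketch of the fixed-effect version follows; the 2x2 counts are hypothetical and are not taken from the fourteen included studies.

```python
import math

def pooled_or_fixed(studies):
    """Inverse-variance (fixed-effect) pooled odds ratio.

    Each study is a 2x2 table (a, b, c, d):
    a = variant-allele cases, b = wild-type cases,
    c = variant-allele controls, d = wild-type controls.
    """
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log OR
        w = 1 / var                           # inverse-variance weight
        num += w * log_or
        den += w
    pooled_log = num / den
    se = math.sqrt(1 / den)
    # 95% CI back-transformed to the OR scale
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se),
            math.exp(pooled_log + 1.96 * se))

# Hypothetical allele counts for two studies (illustration only)
or_, lo, hi = pooled_or_fixed([(20, 180, 60, 340), (15, 135, 45, 255)])
```

An OR below 1 with a CI excluding 1, as for C282Y above, is what motivates the "protective factor" interpretation.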
Abstract:
Exposure to nitrogen oxides (NOx) emitted by burning fossil fuels has been associated with respiratory diseases. We aimed to estimate the effects of NOx exposure on mortality owing to respiratory diseases in residents of Taubaté, São Paulo, Brazil, of all ages and both sexes. This time-series ecological study, covering August 1, 2011 to July 31, 2012, used information on deaths caused by respiratory diseases obtained from the Health Department of Taubaté. Estimated daily levels of pollutants (NOx, particulate matter, ozone, carbon monoxide) were obtained from the Centro de Previsão de Tempo e Estudos Climáticos Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System. These environmental variables were used to adjust the multipollutant model for apparent temperature. To estimate the association between mortality owing to respiratory diseases and air pollutants, generalized additive Poisson regression models were developed, with lags of up to 5 days. There were 385 deaths, with a daily mean (±SD) of 1.05±1.03 (range: 0-5). Exposure to NOx was significantly associated with mortality owing to respiratory diseases: relative risk (RR)=1.035 (95% confidence interval [CI]: 1.008-1.063) for lag 2, RR=1.064 (95%CI: 1.017-1.112) for lag 3, RR=1.055 (95%CI: 1.025-1.085) for lag 4, and RR=1.042 (95%CI: 1.010-1.076) for lag 5. A 3 µg/m3 reduction in NOx concentration resulted in a decrease of 10-18 percentage points in the risk of death caused by respiratory diseases. Even at NOx concentrations below the acceptable standard, there is an association with deaths caused by respiratory diseases.
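In a Poisson regression model, the relative risks above are obtained by exponentiating the fitted coefficient times the concentration change. A sketch of that arithmetic, using an illustrative coefficient rather than the paper's fitted value:

```python
import math

# Hypothetical log-RR per 1 ug/m3 of NOx at a given lag; the paper's
# actual coefficients are not reported in the abstract.
beta_per_ugm3 = 0.0207

def relative_risk(beta, delta):
    """RR for a `delta` (ug/m3) change in pollutant concentration
    under a log-linear (Poisson regression) model."""
    return math.exp(beta * delta)

rr_per_unit = relative_risk(beta_per_ugm3, 1)       # RR per 1 ug/m3 increase
risk_drop = 1 - relative_risk(beta_per_ugm3, -3)    # risk reduction for a 3 ug/m3 decrease
```

This is how a modest per-unit coefficient translates into the several-percentage-point mortality reduction the abstract attributes to a 3 µg/m3 decrease in NOx.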
Abstract:
Digitalization has been predicted to change the future as a growing range of non-routine tasks will be automated, offering new kinds of business models for enterprises. Service-oriented architecture (SOA) provides a basis for designing and implementing well-defined problems as reusable services, allowing computers to execute them. Service-oriented design has the potential to act as a mediator between IT and human resources, but enterprises struggle with their SOA adoption and lack a linkage between the benefits and costs of services. This thesis studies the phenomenon of service reuse in enterprises, proposing an ontology that conceptually links different kinds of services with their role as part of the business model. The proposed ontology was created on the basis of qualitative research conducted in three large enterprises. Service reuse has two roles in enterprises: it enables automated data sharing among human and IT resources, and it may provide cost savings in service development and operations. From a technical viewpoint, the ability to define a business problem as a service is one of the key enablers for achieving service reuse. The research proposes two service identification methods: the first identifies prospective services in the existing documentation of the enterprise, and the second models the services from a functional viewpoint, supporting service identification sessions with business stakeholders.
Abstract:
The sorption behavior of dry products is generally affected by the drying method, and sorption isotherms are useful for determining and comparing the thermodynamic properties of powders processed by different methods. The objective of this study was to analyze the effects of different drying methods on the sorption properties of passion fruit pulp powder. The powder was dehydrated using four dryers: vacuum, spray, vibro-fluidized bed, and freeze dryers. The moisture equilibrium data of passion fruit pulp (PFP) powders with 55% maltodextrin (MD) were determined at 20, 30, 40, and 50 ºC. The curves exhibited type III behavior according to Brunauer's classification, and the GAB model was fitted to the experimental equilibrium data. The equilibrium moisture contents of the samples were little affected by temperature variation. The spray dryer provided a dry product with higher adsorption capacity than the other methods, and vibro-fluidized bed drying showed higher adsorption capacity than vacuum and freeze drying, which presented the same adsorption capacity. The isosteric heats of sorption were found to decrease with increasing moisture content. Considering the effect of drying methods, the highest isosteric heat of sorption was observed for powders produced by spray drying, whereas powders obtained by vacuum and freeze drying showed the lowest.
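The GAB (Guggenheim–Anderson–de Boer) model fitted above has a standard closed form relating equilibrium moisture content to water activity. A sketch follows; the parameter values in the example call are hypothetical, since the abstract does not report the fitted constants.

```python
def gab_moisture(aw, xm, c, k):
    """GAB sorption isotherm: equilibrium moisture content as a
    function of water activity `aw`, monolayer moisture content `xm`,
    and the energy constants C and K."""
    return (xm * c * k * aw) / ((1 - k * aw) * (1 - k * aw + c * k * aw))

# Illustrative parameters only (small C gives the type III shape
# mentioned in the abstract); not the study's fitted values.
x_low = gab_moisture(0.4, xm=0.05, c=0.5, k=0.9)
x_high = gab_moisture(0.8, xm=0.05, c=0.5, k=0.9)
```

Fitting `xm`, `c`, and `k` to the measured equilibrium data at each temperature is what allows the adsorption capacities of the four drying methods to be compared on a common basis.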
Abstract:
The Failure Mode and Effect Analysis (FMEA) method was applied for risk assessment of confectionery manufacturing, in which traditional methods and equipment were used intensively in production. Potential failure modes and effects, as well as their possible causes, were identified in the process flow. Processing stages that involve intensive handling of food by workers had the highest risk priority numbers (RPN = 216 and 189), followed by chemical contamination risks in different stages of the process. The application of corrective actions substantially reduced the RPN values. Therefore, the implementation of the FMEA model in confectionery manufacturing improved the safety and quality of the final products.
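In FMEA, an RPN is conventionally the product of severity, occurrence, and detection ratings, each scored 1-10. The decompositions below are illustrative only; the abstract does not give the individual ratings behind 216 and 189.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of three 1-10 ratings."""
    for factor in (severity, occurrence, detection):
        if not 1 <= factor <= 10:
            raise ValueError("FMEA ratings must be between 1 and 10")
    return severity * occurrence * detection

# One possible (hypothetical) decomposition of the abstract's top values
top = rpn(severity=6, occurrence=6, detection=6)     # 216
second = rpn(severity=7, occurrence=9, detection=3)  # 189
```

Corrective actions lower the RPN by reducing occurrence (e.g., better hygiene procedures) or improving detection (e.g., added inspection steps), which is how the abstract's "substantially reduced" values arise.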
Abstract:
This study is based on a large survey of over 1500 Finnish companies' usage of, needs for, and implementation difficulties with management accounting systems. The study uses quantitative, qualitative, and mixed methods to answer the research questions. The empirical data was gathered through structured interviews with randomly selected companies of varying sizes and industries. The study answers the three research questions by analyzing the characteristics and behaviors of companies operating in Finland. The study found five distinctive groups of companies according to the characteristics of their cost information and management accounting system use. It also showed that the state of cost information and management accounting systems depends on the industry and the size of the company. Over 50% of the companies either did not know how their systems could be updated or saw their systems as inadequate. The qualitative side also highlighted the need for tailored and integrated management accounting systems to create more value for company managers. The major inhibitors of new system implementation were the lack of both monetary and human resources. Through the use of mixed methods and design science, a new and improved sophistication model is created, combining the empirical results with previous literature. The sophistication model shows the different stages of management accounting systems in use and what companies can achieve by implementing and upgrading their systems.
Abstract:
The advancement of science and technology makes it clear that no single perspective is sufficient to describe the true nature of any phenomenon. This is why interdisciplinary research has been gaining attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information; on the other hand, it provides guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second studies reaction systems, a modeling framework whose rationale is built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic over the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals, such as template-based assembly and DNA rearrangement pathway recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs.
We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are the other nature-inspired modeling framework studied in this thesis. Their rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are complementary to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than on the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model of the heat shock response mechanism based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state, and periodicity, to enable model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties ranges from P through NP- and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
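The one-step semantics of a reaction system described above, where facilitation and inhibition decide which reactions fire, can be sketched compactly. The toy reactions below are illustrative only and unrelated to the thesis's heat shock response model.

```python
def result(state, reactions):
    """One step of a reaction system.

    A reaction (R, I, P) is enabled on a state when all its reactants R
    are present and none of its inhibitors I are; the successor state is
    the union of the product sets P of all enabled reactions (entities
    not sustained by any reaction vanish - there is no permanency).
    """
    nxt = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            nxt |= products
    return nxt

# Toy reaction system over entities {a, b, c} (hypothetical example):
rs = [({"a"}, {"b"}, {"c"}),        # a produces c, unless b inhibits it
      ({"c"}, set(), {"a", "b"})]   # c regenerates a and b
```

For instance, from state `{"a"}` only the first reaction is enabled, so the next state is `{"c"}`; from `{"a", "b"}` the first reaction is inhibited and the second lacks its reactant, so the system reaches the empty state. Cause-effect chains like these are exactly what the decision problems (steady state, periodicity, conservation) quantify over.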
Abstract:
The aim of this thesis is to determine the effects of the lignin separation process on the pulp mill chemical balance, especially the sodium/sulphur balance. The objective is to develop a simulation model with the WinGEMS process simulator and use that model to simulate the chemical balances and process changes. The literature part explains what lignin is and how kraft pulp is produced. It also introduces the methods that can be used to extract lignin from the black liquor stream and how those methods affect the pulping process. In the experimental part, seven different cases are simulated with the created simulation model. The simulations are based on a selected reference mill that produces 500,000 tons of bleached air-dried (90%) pulp per year, and they include the chemical balance calculation and the estimated production increase. Based on the simulations, the heat load of the recovery boiler can be reduced and pulp production increased when lignin is extracted. The simulations showed that decreasing the waste acid stream intake from the chlorine dioxide plant is an effective method to control the sulphidity level when about 10% of the lignin is extracted. At higher lignin removal rates, in-mill sulphuric acid production was found to be a better alternative for sulphidity control.
Abstract:
The computer game industry has grown steadily for years, and in revenues it can be compared to the music and film industries. The industry has been moving to digital distribution. Computer gaming and the concept of the business model are discussed among industry practitioners and the scientific community. The significance of the business model concept has increased in the scientific literature recently, although the concept itself is still under active discussion. This thesis studies the role of the business model in the computer game industry. Computer game developers, designers, project managers, and organization leaders in 11 computer game companies were interviewed. The data was analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of the organization affects the business model. Human capital was identified as crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are highly valued, as are the technical skills needed to perform various activities. Marketing and customer relationships are also key elements of the computer game business model. Financing and partners are especially important for startups, when the organization is dependent on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements carry the most weight.
Abstract:
In the new age of information technology, big data has grown to be a prominent phenomenon. As information technology evolves, organizations have begun to adopt big data and apply it as a tool throughout their decision-making processes. Research on big data has grown in recent years, but mainly from a technical stance, and there is a void in business-related cases. This thesis fills the gap in the research by addressing big data challenges and failure cases. The Technology-Organization-Environment framework was applied to carry out a literature review of trends in Business Intelligence and Knowledge Management information system failures. A review of extant literature was carried out using a collection of leading information systems journals. Academic papers and articles on big data, Business Intelligence, Decision Support Systems, and Knowledge Management systems were studied from both failure and success aspects in order to build a model of big data failure. I then delineate the contribution of the information systems failure literature, as it provides the principal dynamics behind the technology-organization-environment framework. The gathered literature was categorized, and a failure model was developed from the identified critical failure points. The failure constructs were further categorized, defined, and tabulated into a contextual diagram. The developed model and table are designed to act as a comprehensive starting point and as general guidance for academics, CIOs, and other system stakeholders, facilitating decision-making in the big data adoption process by measuring the effect of technological, organizational, and environmental variables on perceived benefits, dissatisfaction, and discontinued use.
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but they still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tests whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation toward riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of the thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation remains highly dependent on the quality of the input estimates.
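At its core, the Black–Litterman update blends equilibrium (prior) returns with investor views, weighted by their respective precisions. A one-asset, one-view sketch of that blending follows; the full model is the matrix analogue, and all numbers here are illustrative rather than taken from the thesis's fixed income data.

```python
def bl_posterior(pi, tau_sigma, q, omega):
    """Scalar Black-Litterman posterior expected return.

    pi:        equilibrium (prior) return
    tau_sigma: prior variance (tau * sigma^2)
    q:         the investor's view on the return (e.g. a VAR forecast)
    omega:     uncertainty (variance) of that view

    The posterior is the precision-weighted average of prior and view.
    """
    w_prior = 1 / tau_sigma
    w_view = 1 / omega
    return (w_prior * pi + w_view * q) / (w_prior + w_view)

# Illustrative: equal precisions pull the posterior halfway
# between the 4% equilibrium return and the 6% view.
mu = bl_posterior(pi=0.04, tau_sigma=0.0004, q=0.06, omega=0.0004)
```

This precision weighting is what produces the "controlled manner" of weight adjustment noted above: a confident view (small `omega`) tilts the portfolio strongly, while an uncertain one barely moves it from the strategic benchmark.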
Abstract:
This paper first presents some basic ideas and models of a structuralist development macroeconomics that complements and updates the ideas of the structuralist development economics that was dominant between the 1940s and the 1960s. A system of three models focusing on the exchange rate (the tendency toward cyclical overvaluation of the exchange rate, a critique of growth with foreign savings, and a new model of the Dutch disease) shows that the exchange rate is not just volatile but chronically overvalued, and for that reason it is not just a macroeconomic problem; as a long-term disequilibrium, it lies at the core of development economics. Second, the paper summarizes "new developmentalism": a sum of growth policies based on these models and on the experience of fast-growing Asian countries.
Abstract:
Finnish legislation requires a safe and secure learning environment. However, comprehensive, risk-based safety and security management (SSM) and management commitment to the implementation and development of SSM are not mentioned in the legislation. Multiple institutions, operators, and researchers have studied and developed safety and security in educational institutions over the past decade. Typically, the approach has been fragmented, without emphasizing the importance of comprehensive SSM. The development needs of safety and security operations in universities have been studied; however, in universities of applied sciences (UASs) and in elementary schools (ESs), the performance level, strengths, and weaknesses of comprehensive SSM have not. The objective of this study was to develop the comprehensive, risk-based SSM of educational institutions by developing the new Asteri consultative auditing process and studying its effects on auditees. Furthermore, the performance level of comprehensive SSM in UASs and ESs was studied using Asteri and the TUTOR model developed by the Keski-Uusimaa Department for Rescue Services. In addition, strengths, development needs, and differences were identified. In total, 76 educational institutions were audited between 2011 and 2014. The study is based on logical empiricism, and an observational applied research design was used. Auditing, observation, and an electronic survey were used for data collection, and statistical analysis was used to analyze the collected information. In addition, thematic analysis was used to analyze the development areas of the organizations mentioned by the respondents in the survey. As one of its main contributions, this research presents the new Asteri consultative auditing process. Organizations with low performance levels on the audited subject benefit the most from the Asteri consultative auditing process.
Asteri may be usable in many different types of audits, not only SSM audits. As a new result, this study provides knowledge on attitudes related to auditing. According to the research findings, auditing may generate negative attitudes, and the auditor should take them into account when planning and preparing for audits. Negative attitudes can be compensated for by bringing added value, objectivity, and positivity to the audit and thus improving its positive effects on knowledge and skills. Moreover, as the results of this study show, auditing safety and security issues does not increase feelings of insecurity, but rather increases feelings of safety and security when the new Asteri consultative auditing process is used with the TUTOR model. The results showed that SSM in the audited UASs was statistically significantly more advanced than in the audited ESs. However, there is still room for improvement in both the ESs and the UASs, as the approach to SSM was fragmented. It can be assumed that the majority of Finnish UASs and ESs likely do not meet the basic level of comprehensive, risk-based SSM.
Abstract:
Celebrity endorsement has increased in popularity over the past decades, and companies are willing to spend increasingly large amounts of money on it. Even though multiple studies support celebrity endorsement, further research on its impact on advertising effectiveness is called for. Furthermore, the role of consumers' product class involvement in advertising needs to be studied further. The purpose of this study is to explore whether consumers' product class involvement and exposure to celebrity endorsers affect consumers' brand recall. Supported by earlier studies, brand recall was used as a measure of advertising effectiveness. In general, a psychological approach was chosen for building the theoretical framework. The concept of classical conditioning was presented in order to understand why people act the way they do. Balance theory and the meaning transfer model were presented in order to study how celebrities can be used effectively in an advertising context. Further, the importance of product class involvement in advertising effectiveness was evaluated. Hypotheses were formulated based on a literature review of the existing research. Because of the versatility of the research design, a mixed methods approach was adopted. The empirical part of the study was conducted in three stages. First, a pre-test was conducted to choose suitable product endorsers for the advertisement stimuli used in the experiment. Second, an eye-tracking experiment with 30 test subjects was conducted to study how people view advertisements and whether the familiarity of the product endorser and consumers' product class involvement affect brand recall. For the experiment, a fictional brand was created to avoid bias in brand recall. Third, qualitative interviews with 15 test subjects were conducted in the post-experiment stage to gain a deeper understanding of the phenomenon and to make sense of the findings from the experiment.
Findings from this study support celebrity endorsement by suggesting that a famous spokesperson does not steal attention from brand information more than a non-celebrity product endorser does. As a result, the use of a celebrity endorser did not decrease brand recall. The results support earlier research, as consumers' higher product class involvement resulted in better brand recall. Findings from the interviews suggest that consumers have positive perceptions of celebrity endorsement in general. However, celebrity–brand congruence is a crucial factor in creating attitudes towards the advertisement. Future research ideas were presented based on the limitations and results of this study.